Recently I’ve been “[sunsetting][]” old email accounts I don’t really need any more. One of them was on a private domain, hosted with Fastmail. In most cases I could have set up a simple forwarding rule from my domain registrar to my master account, but this one domain made extensive use of subdomains to filter and “tag” email from services — e.g. user@service.domain.com. Forwarding from the registrar would only catch mail at the top level; everything else would return an error to the sender[1]. If I were to move the DNS to Cloudflare, like my other “active” domains, I wouldn’t be able to do even this basic forwarding; I’d have to set up my own mail server to handle the domain.

[sunsetting]:{{site.url}}{% link _posts/2017-09-08-sunsetting-a-decades-old-email-address.markdown %}

Running your own email server, in 2017, is a fool’s errand. I needed to find another way.

After a few evenings’ research, weighing the pros and cons of each approach, I settled on using Mailgun to route email[2].

Step 1. Mailgun

Create your Mailgun account. Add your domain. Mailgun will encourage you to use a subdomain, but I didn’t. During the setup, you’ll be presented with several DNS records you need to add to Cloudflare – 2 TXT records, 2 MX records, and a CNAME record. Leave this tab open for now.
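The exact values are generated per-domain, but the records will look roughly like this (the DKIM record name and key below are placeholders for whatever Mailgun shows you):

```
TXT    domain.com                   "v=spf1 include:mailgun.org ~all"
TXT    smtp._domainkey.domain.com   "k=rsa; p=<DKIM-public-key>"
MX     domain.com                   mxa.mailgun.org    10
MX     domain.com                   mxb.mailgun.org    10
CNAME  email.domain.com             mailgun.org
```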

Step 2. Cloudflare

Add the domain to Cloudflare, if you haven’t already. Modify the DNS records to remove any previous MX records and add in the details Mailgun gave you. To get the subdomain email addresses to work, you also need to add 2 more MX records similar to these:

MX    *.domain.com    mxa.mailgun.org    10
MX    *.domain.com    mxb.mailgun.org    10

For some reason I also needed to add a wildcard A record (*.domain.com) to get things to work correctly. This might be a Cloudflare quirk.

Step 3. Back to Mailgun

Click through to finish adding the domain to your Mailgun account. Depending on DNS propagation timings, you might need to click the “Check DNS Records” button again later to verify the domain; usually it only takes minutes, but it can be a couple of hours. Under Domain Settings, change Wildcard Domain to On.

Switch to the Routes screen. Create a Route. A “Catch All” type should be fine, but you can check out the Mailgun documentation to define more complex rules. Make sure the Forwarding checkbox is ticked, and enter your master email address as the destination. Set the priority to 10, give the Route a name, and click Create Route.

Use the Test Your Routes box to, well, test the route with a sample email address to make sure it will fire appropriately.
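Behind the UI, the route I created boils down to something like this (the destination address is a placeholder, and whether you add a stop() action is up to you):

```
Priority:    10
Expression:  catch_all()
Actions:     forward("master-address@example.com"), stop()
Description: Forward everything to my master account
```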

Step 4. There is no Step 4.

At this point you should be done. I needed to wait a few hours for the DNS records to propagate out to other services before test emails would arrive properly, but once DNS did its thing I was receiving email at my master address just as I had been with Fastmail.


  1. Gmail, for example, tells you the subdomain doesn’t accept mail, using a nice, clear message. Other services are usually far less helpful.
  2. Note that Mailgun appears to discourage you from using their service like this. 

I recently decided it was time to consolidate several email accounts, spread across multiple services, to one easily managed account. Some of these have been in use for over a decade. Some are “custom” domains, some are Gmail and other hosted services.

Partly this was for simplifying things — “what account did I use for signing up to that?” — partly it was to reduce the number of services I pay for every year (more on that in another post. Probably.) and partly it was to reduce my online “footprint” for privacy and security reasons. As a welcome side effect it would dramatically reduce the volume of spam I receive!

Importantly, my goal wasn’t to just redirect them all into yet another account. That would just lead to more juggling in the future. My goal was a full purge. I would have one master account and anything old would no longer exist.

Step 1. Identify What Was “To Go”

I had one fairly recent Gmail account which was practically never used, and had a decent address. I quickly settled on this as my new master account. Everything else was on the chopping block from this point on. This included some domains set up at Fastmail which used some special configuration to let me segregate each service I’d signed up to under its own subdomain address (e.g. user@service.domain.com). More on these domains in a later post.

I checked through my password manager to make sure I’d found all of the email services I had logins for, in case I’d forgotten about any.

Step 2. Secure the Master Account

Buy a Security Key. Use it. I followed these steps to get things set up so I have 2FA through either the key or the authenticator app on my phone. This lets me avoid the backup SMS codes I’ve never really felt were particularly secure.

I really wish more services supported this setup, instead of limiting you to app + SMS. Or primarily SMS — yes, I’m looking at you, Twitter.

Step 3. Temporarily Redirect Everything to the Master Account

This was so I could categorise and prioritise the mail I was receiving. I applied a filter to incoming mail to tag it with the service it was forwarded from. From here I was able to identify where I had to put the most work in.
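In Gmail, a filter per forwarded account does the trick; the deliveredto: operator matches the address a message was originally sent to. The address and label below are placeholders:

```
Matches:  deliveredto:(anything@old-domain.com)
Do this:  Apply label "Migration/old-domain", Never send to Spam
```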

Step 4. Update Priority (Non-Email) Services

Mostly this involved going through my 1Password vault and updating anything I felt was important during the first pass. As many of the services used an email address as part of the login, I could search for each address in turn to cut down on the number of sites I’d have to update in each session.

Step 5. Purge Unnecessary Services

While I was updating services it was as good a time as any to delete any accounts which were idle. Using Just Delete Me sped the process up considerably. I think I’d cleared out 100 logins from my 1Password Vault by the end of the first week.

Step 6. Export Email Data

If it’s an old Gmail account, you can use Google Takeout to download an archive of your mail. Other services vary, but the easiest way to grab an archive is to configure a desktop client, synchronise/download your mail to your local computer, and then export from there. This is what I had to do with my Fastmail domains.

Step 7. Tidy-up

We’re in the final stretch now… with all the legwork out of the way, you’ll want to: make sure you’ve told any friends, family, and contacts who have the old email address; update any profiles or web pages where the address is listed; remove the account from your devices; and generally remove as many references to the address as you can.

Step 8. Close the Email Account

This should be the obvious bit, but there might be some caveats… for example, with Gmail you need to close the entire Google Account, taking any data in other Google services like YouTube with it. Some services make it harder than others to delete your account. If in doubt, check Just Delete Me, or try a search for “delete account.”

A clear, concise guide on using a hardware Security Key[1] with a Gmail account. I didn’t even know it was possible to avoid using SMS as your backup second factor — thanks to this guide I have my Key as my main factor and the Authenticator app as backup. No SMS involved. (My phone number has since been removed from my Google account.)

The official documentation/setup guide should really make this clearer.


  1. I use this simple FIDO/U2F key by Yubico (affiliate link) which is the key recommended in the guide. 

The single biggest misconception about iOS is that it’s good digital hygiene to force quit apps that you aren’t using.

I’ve been trying to stamp this out amongst my friends and family for as long as I can remember. The system is smart; let it do what it was designed to do.

For what it’s worth, nearly everyone I’ve corrected on this has at some point been a convert from Android.

External link: Public Service Announcement: You Should Not Force Quit Apps on iOS

1. Sublime Text (Portable) + Plugins

Using Sublime Text with the Sublime Jekyll and Git plugins allows me to write and publish posts from within a single app. By and large, Sublime is the “Admin UI” for this website.

I tried using other editors, but I just couldn’t get them to work the way I wanted them to work. Your mileage may vary though, so if you don’t feel Sublime is still worth the license fee, Visual Studio Code is good.

Using the portable version of Sublime lets me drop it into a USB stick/Dropbox/whatever and take my settings with me. 99% of the time this works seamlessly, but now and then I’ll run into an environment issue breaking the link between Git and Sublime. This isn’t that big of an issue though — I just open up a terminal session for all of the Git commands instead.

2. Setup Some Templates

Sublime Jekyll provides a lot of handy commands for quickly working with Jekyll — ctrl-shift-p jnp to start a new post was my most used for quite some time. But manually typing in all of the front matter except the title was a PITA. I use front matter to control the “style” of different post types on the front page, so it is important to my setup. Then I found out about the ability to create templates, which sped things up considerably.

To use templates, create a _templates folder in your Jekyll site. Create one or more markdown files in there to correspond to your various front matter defaults. $n can be used to set tab-stops. For example, here are my “Post” and “Link” templates respectively:

---
layout: post
title: 
date: $1
---

---
layout: link
title: 
date: $1
link_title: $2
link: $3
---

Note the tab-stop marker on date, which lets me quickly insert the full, formatted date and time using a Sublime Jekyll command.

3. Git, plus a Commit Hook

I can’t always install Ruby + Jekyll on the environment I’m working on, so I offload the compilation of the site to the web server itself. To automate things, I manage the source files with Git, push changes to a repository on the server, and a commit hook fires whenever the server receives an update.

If you followed my post on [setting up a blog with Caddy and Jekyll][g], you might already have this setup. If not, the details can be found in this guide on Digital Ocean, which I linked to from there.
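For illustration, the server-side hook boils down to something like this sketch; the paths, branch name, and destination directory here are assumptions, and the Digital Ocean guide covers the real setup:

```
#!/bin/bash
# post-receive hook: check out the pushed source and rebuild the site.
# Paths below are examples; adjust to your server layout.
GIT_WORK_TREE=/var/www/site-source git checkout -f master
cd /var/www/site-source || exit 1
jekyll build --destination /var/www/site-public
```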

[g]:{{site.url}}{% link _posts/2017-06-26-start-a-blog-with-caddy-and-jekyll.md %}

4. Write

This bit I’m still working on improving my efficiency at…

I’ve been using the own-brand Element Games Kolinsky Sable brushes for a while now, and I have nothing but really good things to say about them. For the money they are fantastic. Winsor & Newton Series 7s are usually seen as the “gold standard” brushes for miniature painting (obviously it’s one of those subjective things) — I think the Element Games brushes are easily as good as the W&Ns I’ve used over the years. At £19 for the set of 3, it’s hard to argue against the value, either.

Don’t just take my word for it: multiple Slayer Sword winner David Soper likes them too. Interestingly, he found them to be “stiff” compared to the Series 7’s… I had the opposite experience in that mine were “softer” – particularly the “Regiment” brush – which has become my new go-to brush for most tasks.

I’m having a _lot_ of fun in Hitman. I’d go so far as to say it’s surpassed _Hitman 2_ as my favourite entry in the series.

_Hitman 2_ was fun to play, but I never felt there was all that much replayability. Sure, you could go back and try a different approach, but you weren’t really rewarded for it. _Hitman (2016)_ has had me replaying the same level for days at a time, trying to beat all of the Challenges, get all of the Feats, and try out all of the Opportunities.

The approach I’m taking is to max out Mastery on each level and complete all of the Challenges/Feats/Opportunities before moving on to the next, which I think has been the best way to maximise my enjoyment of the game. Once I’ve done all of the levels I’ll go back and play the Escalation modes and bonus missions.

Depending on how much time I have in the evenings, I might be ready for the rumoured start of Season 2 in August…

I completely missed this when Agile Bits introduced their new 1Password.com product (which, admittedly, I didn’t really pay attention to), but standalone licenses for 1Password are no longer being marketed. If you want one, you have to email Support to get one.

As someone who’s bought multiple versions and upgrades of 1Password over the years, I’m a little torn over this. On the one hand, if there are genuine technical limitations caused by supporting standalone versions with local vaults, and the new platform provides a truly better experience, then great. On the other, I’m always wary of putting all my eggs in one basket… if Agile Bits were to suffer some catastrophe tomorrow then my standalone 1Password 4 and local vault wouldn’t be affected in any way at all. I’m mostly fine with the security aspects, as from what I can tell, even if they were to be breached, all a hacker could get would be an encrypted blob of your data.

Something for me to think about, I guess.

External link: 1Password Support Forum Thread

A powerful account of Coraline Ada Ehmke’s terrible treatment at GitHub. Please take some time to read it.

I think back on the lack of options I was given in response to my mental health situation and I see a complete lack of empathy. I reflect on the weekly one-on-ones that I had with my manager prior to my review and the positive feedback I was getting, in contrast to the surprise annual review. I consider the journal entries that I made and all the effort I put in on following the PIP and demonstrating my commitment to improving, only to be judged negatively for the most petty reasons. I think about how I opened up to my manager about my trauma and was accused of trying to manipulate her feelings. I remember coming back from burying my grandmother and being told that I was fired.

GitHub has made some very public commitments to turning its culture around, but I feel now that these statements are just PR. I am increasingly of the opinion that in hiring me and other prominent activists, they were attempting to use our names and reputations to convince the world that they took diversity, inclusivity, and social justice issues seriously. And I feel naive for having fallen for it.

This isn’t the first time GitHub has run afoul of a toxic internal culture. Perhaps ironically, from my view it appears it was the “corporate” controls implemented after the previous fallout, combined with cultural issues and inexperience in how those controls are meant to be applied, that led to this horrible experience.

I’ve done staff management in companies hundreds of times the size of GitHub; I recognise every tool and process mentioned in Coraline’s account, and have been part of them numerous times. Each instance in this retelling seems to be a perversion of how the process is meant to be applied. Tools designed to help and protect everyone involved were turned against the party with the least power.

After my last post, I came across some discussion about implementing JSON Feed, and whether using a template is a good way to implement it in a site. The consensus seems to be “No,” with one of the authors of the spec weighing in. For the most part, I do agree – a template is more likely to break under some edge case than a proper serializer – but until there is more support for the spec I see it as a pragmatic short-term solution. So proceed at your own risk for now!

I can’t take credit for this – I found the code below (from vallieres) after a quick search for adding feed.json to Jekyll without plugins. The only thing I’ve added is the sitemap front matter, which will exclude the output file from our sitemap.xml.

---
layout: null
sitemap:
  exclude: 'yes'
---
{
    "version": "https://jsonfeed.org/version/1",
    "title": "{{ site.title | xml_escape }}",
    "home_page_url": "{{ "/" | absolute_url }}",
    "feed_url": "{{ "/feed.json" | absolute_url }}",
    "description": {{ site.description | jsonify }},
    "icon": "{{ "/apple-touch-icon.png" | absolute_url }}",
    "favicon": "{{ "/favicon.ico" | absolute_url }}",
    "expired": false,
    {% if site.author %}
    "author": {% if site.author.name %} {
        "name": "{{ site.author.name }}",
        "url": {% if site.author.url %}"{{ site.author.url }}"{% else %}null{% endif %},
        "avatar": {% if site.author.avatar %}"{{ site.author.avatar }}"{% else %}null{% endif %}
    },{% else %}"{{ site.author }}",{% endif %}
    {% endif %}
"items": [
    {% for post in site.posts limit:36 %}
        {
            "id": "{{ post.url | absolute_url | sha1 }}",
            "title": {{ post.title | jsonify }},
            "summary": {{ post.seo_description | jsonify }},
            "content_text": {{ post.content | strip_html | strip_newlines | jsonify }},
            "content_html": {{ post.content | strip_newlines | jsonify }},
            "url": "{{ post.url | absolute_url }}",
            {% if post.image.size > 1 %}"image": {{ post.image | jsonify }},{% endif %}
            {% if post.link.size > 1 %}"external_url": "{{ post.link }}",{% endif %}
            {% if post.banner.size > 1 %}"banner_image": "{{ post.banner }}",{% endif %}
            {% if post.tags.size > 1 %}"tags": {{ post.tags | jsonify }},{% endif %}
            {% if post.enclosure.size > 1 %}"attachments": [ {
              "url": "{{ post.enclosure }}",
              "mime_type": "{{ post.enclosure_type }}",
              "size_in_bytes": {{ post.enclosure_length }}
            } ],{% endif %}
            "date_published": "{{ post.date | date_to_xmlschema }}",
            "date_modified": "{{ post.date | date_to_xmlschema }}",
            {% if post.author %}
                "author": {% if post.author.name %} {
                "name": "{{ post.author.name }}",
                "url": {% if post.author.url %}"{{ post.author.url }}"{% else %}null{% endif %},
                "avatar": {% if post.author.avatar %}"{{ post.author.avatar }}"{% else %}null{% endif %}
                }
                {% else %}"{{ post.author }}"{% endif %}
            {% else %}
                "author": {% if site.author.name %} {
                "name": "{{ site.author.name }}",
                "url": {% if site.author.url %}"{{ site.author.url }}"{% else %}null{% endif %},
                "avatar": {% if site.author.avatar %}"{{ site.author.avatar }}"{% else %}null{% endif %}
                }
                {% else %}
                "{{ site.author }}"
                {% endif %}
            {% endif %}
        }{% if forloop.last == false %},{% endif %}
    {% endfor %}
    ]
}
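Since a template like this can silently emit invalid JSON when a post trips an edge case, it’s worth sanity-checking the generated file after a build. A minimal check along these lines (my own sketch, not part of the original snippet) would catch most breakage:

```python
import json

# Top-level keys a JSON Feed is expected to carry; "version" and "title"
# are required by the spec, and this template always emits "items".
REQUIRED_KEYS = {"version", "title", "items"}

def validate_feed(text):
    """Parse feed text as JSON and check the basic JSON Feed keys exist."""
    feed = json.loads(text)  # raises a ValueError subclass on invalid JSON
    missing = REQUIRED_KEYS - feed.keys()
    if missing:
        raise ValueError("feed is missing keys: " + ", ".join(sorted(missing)))
    return feed

# After `jekyll build`, point it at the generated file, e.g.:
# with open("_site/feed.json", encoding="utf-8") as f:
#     validate_feed(f.read())
```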

Lately I’ve found myself enjoying the YouTube channel Outside Xbox (and their sister channel, Outside Xtra). Normally what happens is I’ll put a YouTube video on the TV as “background noise” and it ends up playing videos from the same channel for a few hours; in this case I went down a rabbit hole of their Hitman videos. Whether it was the normal playthroughs, the challenge modes, or the “3 ways to play”, I was hooked and wanted to play the game for myself. But I refuse to pay full price for a game that’s been out for a while, so it sat on my Steam Wishlist until last week, when it was discounted by 60% on the evening before the Steam Summer Sale.

I finally got round to playing Hitman today, and so far I’d say it’s got the potential to be the first game to really hold my attention since Metal Gear Solid 5. I loved Hitman 2, and kept tabs on the ups and downs of the series after that, so having another game in the series feel as good as that one did is certainly welcome.

So far I’ve only completed the prologue missions (though I did replay the very first mission several times), so these are very early impressions. Fingers crossed the game holds up as I get further through it!

I caught the first trailer for the new Marvel’s Inhumans TV series. The whole thing looked so stiff, awkward, and sterile. I’m not sure what I was expecting, but based purely on the trailer I have zero desire to watch the show. Marvel’s movies and Netflix series (mostly) manage to feel somewhat anchored to the “real world” despite how fantastical the plot or setup might be… but Inhumans had none of that quality on show.

“We had no code and no art assets,” Blizzard 3D Art Director Brian Sousa confirmed to Ars Technica. The 2017 project’s entire art pipeline was “eyeballed,” Sousa said, with recovered concept artwork, sketches, and original boxes and manuals used as reference materials. Not all code was missing, as Blizzard has been issuing patches to the original game’s code base for nearly 20 years. Also, a member of the sound team thankfully had backups of the original sound and voice recordings, which are now reprocessed in higher-fidelity 44,100Hz format.

I’d heard, years ago, that the majority of the original StarCraft code had been lost, but I figured it was just a rumour. Sounds like the team behind StarCraft: Remastered had a big task to recreate the game in a way some of its biggest fans would appreciate.

External link: StarCraft Remastered devs unveil price, explain how much is being rebuilt

I’ve listened to a lot of 40K podcasts over the last couple of years. Over that time I’ve slowly winnowed my subscriptions down to just a handful.

  1. Forge The Narrative – my favourite 40K podcast of the last few years.
  2. Chapter Tactics – from Frontline Gaming, but distinct enough from their other shows to merit its own subscription.
  3. Frontline Gaming – this is the main Frontline Gaming podcast; the feed also includes Chapter Tactics and some other smaller shows.
  4. Ashes of the Imperium – this one is new, but it’s by the team behind the very good Bad Dice AoS podcast.

My biggest gripe with most 40K podcasts tends to be length. Sorry, but unless you’re very, very compelling to listen to, I am not going to listen to a podcast episode which is 2-3 hours long (or more!). The podcasts above tend to clock in at around an hour to an hour and a half, which I find to be perfect for my listening habits.

Bonus: Podcasts I’m Evaluating:

8th Edition has brought about a few new podcasts; I’m still deciding whether some of them will stay in my subscriptions list.

Bonus 2: Some Age of Sigmar podcasts

For a while I found the quality of AoS podcasts to be generally higher than that of most 40K podcasts, with only a couple of exceptions. Sadly, my favourite AoS podcast — Heelanhammer — has recently gone on hiatus, so I’m not including it here.

  • Bad Dice
  • Facehammer – can be a bit sweary, so be warned if that’s not your thing.

For various reasons I prefer to remove the www part from my personal-use domains. Setting up Caddy to serve the site from just domain.com is as simple as:

domain.com {
    root /path/to/site/files
    # other directives
}

But this set-up doesn’t provide any way to redirect from www to non-www, meaning anyone who types www.domain.com into the address bar is out of luck. So what to do? Well, Caddy provides a redir directive. Combine it with a second site definition and the {uri} placeholder, like this:

# Original non-WWW site:
domain.com {
    root /path/to/site/files
    # other directives
}
# New, additional "site", for doing the redir
www.domain.com {
    redir https://domain.com{uri}
}

Having just spent faaaar too long getting a sample Liquid code block to not be parsed by Jekyll, I thought I’d better make a note of this, for my own benefit:

When posting Liquid code, make use of the raw tag. Which I can’t seem to post an example of using, because it creates some sort of Inception effect or something…
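For the record (shown here outside Jekyll’s reach, so the Inception effect can’t bite), wrapping Liquid in a raw block looks like this; the tags inside are just examples:

```liquid
{% raw %}
This {{ site.title }} and {% include header.html %} will be shown
literally, not parsed by Liquid.
{% endraw %}
```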

An XML Sitemap can be useful for optimising your site with Google, particularly if you make use of their Webmaster Tools. Jekyll doesn’t come with one out-of-the-box, but it is easy to add one. There’s probably a plugin out there which will automate things, but I just used a normal Jekyll-generated file for mine, based on code found on Robert Birnie’s site.

The only modification I made was to exclude feed.xml from the sitemap. Because this is auto-generated by a plugin I couldn’t add any front-matter to a file to exclude it in the same way as other files.

Create a file called sitemap.xml in the root of your site, and paste the following code into it:

---
layout: null
sitemap:
  exclude: 'yes'
---
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  {% for post in site.posts %}
    {% unless post.published == false %}
    <url>
      <loc>{{ site.url }}{{ post.url }}</loc>
      {% if post.sitemap.lastmod %}
        <lastmod>{{ post.sitemap.lastmod | date: "%Y-%m-%d" }}</lastmod>
      {% elsif post.date %}
        <lastmod>{{ post.date | date_to_xmlschema }}</lastmod>
      {% else %}
        <lastmod>{{ site.time | date_to_xmlschema }}</lastmod>
      {% endif %}
      {% if post.sitemap.changefreq %}
        <changefreq>{{ post.sitemap.changefreq }}</changefreq>
      {% else %}
        <changefreq>monthly</changefreq>
      {% endif %}
      {% if post.sitemap.priority %}
        <priority>{{ post.sitemap.priority }}</priority>
      {% else %}
        <priority>0.5</priority>
      {% endif %}
    </url>
    {% endunless %}
  {% endfor %}
  {% for page in site.pages %}
    {% unless page.sitemap.exclude == "yes" or page.url == "/feed.xml" %}
    <url>
      <loc>{{ site.url }}{{ page.url | remove: "index.html" }}</loc>
      {% if page.sitemap.lastmod %}
        <lastmod>{{ page.sitemap.lastmod | date: "%Y-%m-%d" }}</lastmod>
      {% elsif page.date %}
        <lastmod>{{ page.date | date_to_xmlschema }}</lastmod>
      {% else %}
        <lastmod>{{ site.time | date_to_xmlschema }}</lastmod>
      {% endif %}
      {% if page.sitemap.changefreq %}
        <changefreq>{{ page.sitemap.changefreq }}</changefreq>
      {% else %}
        <changefreq>monthly</changefreq>
      {% endif %}
      {% if page.sitemap.priority %}
        <priority>{{ page.sitemap.priority }}</priority>
      {% else %}
        <priority>0.3</priority>
      {% endif %}
    </url>
    {% endunless %}
  {% endfor %}
</urlset>

If you want fine control over what appears in the sitemap, you can use any of the following front-matter variables:

sitemap:
  lastmod: 2014-01-23
  priority: 0.7
  changefreq: 'monthly'
  exclude: 'yes'

As an example, I use this in my feed.json template to exclude the generated file from the sitemap:

sitemap:
  exclude: 'yes'

And this in my index/archive pages for a daily change frequency:

sitemap:
  changefreq: 'daily'

It’s super simple. Just include a push directive in your site definition. You can leave it at just that, and Caddy will use any Link HTTP headers to figure out what to push.
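For reference, the Link headers Caddy looks for are the standard preload form, along these lines (the paths are examples):

```
Link: </assets/css/site.min.css>; rel=preload; as=style
Link: </assets/js/site.min.js>; rel=preload; as=script
```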

If you want more control, you can expand the directive and specify both the path and associated resources, like so:

example.com {
    root /var/www/example
    push / {
        /assets/css/site.min.css
        /assets/img/logo.png
        /assets/js/site.min.js
    }
}

What this block does is say “for every request with a base of / (i.e. every request), push the following 3 files.” You can customise the base path if you want to, and add more files if you need to, but a block like the one above is what I’m using for this site.

You can find out full details in the Caddy Push documentation.

Lately I’ve been feeling a pull to return to my Warhammer 40,000 Flesh Tearers army, which I started around 4 years ago (and promptly only completed one unit of). I had an idea of a small strike force that was basically just a load of Jump Pack Assault Squads, supported by Land Speeders (with some Death Company elements thrown in). It wouldn’t have been very “competitive”, but it would have been thematic and fun. I didn’t progress the idea very far, as the Blood Angels codex in 7th Edition was… very not good; it also took away the ability to field Assault Squads as a Troops choice, rendering the entire idea invalid.

Now we’re in 8th Edition, I can build the army as I imagined it, using the new detachments in the rule book. By getting back to a small “passion project” of mine, I’m hoping I’ll be able to revive my motivation for hobby projects which has been worryingly low recently. Who knows — I might even add some Primaris Inceptors to the mix for some mobile firepower.