A new boxed game from Games Workshop, coming end of October/start of November, in the same style as Betrayal at Calth. New stuff includes plastic MKIII Marines, Tartaros-pattern Terminators, Legio Custodes, a couple of characters, and possibly more.

I’ve been looking forward to this release since I first heard about it back in March, so it’s great to finally get the confirmation it’s real.

Preamble: I’ve jumped back into WoW over the last couple of months. I’ve had an on-mostly-off relationship with the game over the last few years (since the end of Wrath of the Lich King, really), but the early promise of Legion brought me back.

I spent the time up until the launch 2 weeks ago getting acclimated to the game and all the changes I’d missed. I powered through Warlords of Draenor (WoD) in a few sessions, just through quest content (the massive amounts of rested XP helped a tonne!) then set about getting my professions up to max-level.

Professions in WoD were very, very easy to level up. I had both primary professions, and all secondary professions except Fishing, maxed-out within a weekend1. If anything, I felt a bit underwhelmed by how easy it was. Between the Garrison and Auction House, it took very little time and money to gather everything needed to craft enough stacks of whatever recipe gave the most skill-ups2. Gathering professions didn’t take much effort either.


Professions in Legion are almost nothing like in WoD. In some ways it’s welcome; recipes have different levels of proficiency (1-3), which adds more interest to them. I haven’t found any recipes yet awarding multiple skill levels. There’s profession-related quest content – often to unlock a better proficiency – and even some of the new World Quests are profession-specific.

It’s just a shame that there’s one fundamental flaw: most of the progression is locked behind Dungeons. Take Alchemy as the prime example: after unlocking Legion Alchemy, you get 3 recipes with 1 or 2 more available from a vendor. These will give you a handful of skill levels at most. Beyond that, you have quests which require you to run dungeons to unlock more recipes. You have to run all of the dungeons in Legion, enter a busy free-for-all PVP area, and even complete a WoD Raid to unlock everything in Legion Alchemy. Other professions are similar.

I absolutely don’t mind having to work at progressing my professions, but this is disappointing. Gating them behind Reputations would have made much more sense to me, or some other mechanic which didn’t force players into LFG/LFR… The new World Quests are another decent candidate. In my experience, most players in dungeons are just looking to speed-run bosses for gear upgrades, and if you’re not OK with that they’ll make your life hell until you quit, or just /kick you outright.

It’s always been the case that the most lucrative recipes were acquired through dungeons… but these were optional/rare recipes, mainly of interest to people looking to make serious gold on the Auction House, or who needed them to help their competitive raid team. Levelling up your professions never needed these “high-end” recipes.

I stopped regularly running dungeons years ago because I had too many bad experiences with groups who were rude/intolerant of players “not as skilled” as they were/just plain assholes3. Since then, I’ve dipped in with ever more reluctance. I have zero desire or patience – let alone time – to deal with that again, and have zero faith it’s improved any. I’m sure I’m not the only player who feels like this, judging by posts on the Battle.Net forums and elsewhere.

For players like me, who have long turned away from the group aspects of WoW, professions are the end-game content4. Locking the means to meaningfully progress professions behind dungeons basically locks them away from us, unless we’re prepared to hold our noses and deal with aspects of the game we don’t necessarily enjoy.

It’s still early days for Legion, and even though I’ve hit 110 on my main, there’s still a tonne of other content I can play, for now. And other characters. I’m just hoping Blizzard do something to make the professions a bit more accessible in a future patch. They’ve achieved their intent to make them more interesting, I’m just not convinced they achieved the “fun” bit.


  1. Most of this was just travel time while levelling Archaeology.
  2. This was new to me – certain recipes giving more than 1 level at a time. 
  3. At first, LFG was a blessing compared to trying to organise a group through chat. Unfortunately it made it impossible to weed-out the bad-eggs before starting a dungeon. My feelings on WoW Dungeons are a topic for another time. 
  4. Alongside Reputations, exploring, lore, and Achievements. 

Lock Screen

Raise to Wake is a feature I’ve wanted for a while, so I love that. It sometimes seems a little sensitive, but I guess I’ll either get used to it, or it’ll be tweaked in a software update. The new behaviour of unlocking your phone without going to the Home Screen until you press the Home button seemed a bit unintuitive to me, so I’ve changed a setting under General > Accessibility > Home Button to remove the press.

Notifications

Functionally, the new notifications are great, and will get better as more apps embrace the feature. Like others, I’m not a fan of the styling, which is very evocative of “Web 2.0”. Clear All is another minor feature I’ve wanted forever, so I’m glad that’s there; I just wish I hadn’t had to Google to discover it’s hidden behind a 3D Touch gesture. These hidden or unintuitive features and gestures are probably my biggest peeve with iOS 10 for now.

Related to the notification area, I don’t get why the “Today” widget area is duplicated here and to the left of the Home Screen. One or the other would’ve been better, at least in my opinion. Maybe that’s because I never used the old “Today” screen, but did use the old search screen which used to be to the left of the Home Screen…

Messages

Overall I like the update, but I’ve found some of the new features to be really unintuitive to use. The message styles (invisible ink, balloons, etc.) are hidden behind a 3D Touch of the send button – so if you don’t get it right you’ll find yourself accidentally sending the message before it’s finished. This is a very minor thing, but it does cause frustration. I also found the Digital Touch features confusing to use, and the associated gestures a bit hit-and-miss. “Playback” of these messages is also hit-and-miss: sometimes they play automatically, but most times they don’t.

This article from The Verge has a good rundown of the new features of iMessage and how they work.

Other

Being able to (finally) remove in-built apps is obviously something which has received some headlines. Surprisingly, I’ve removed fewer than I expected… I think it’s only Stocks, Tips, Find My Friends and Weather. I’ve actually found myself switching to a couple of the in-built apps.

If – like me – you were eyeing up the Volkite Weapon Kits from Forge World as a means of expanding the Betrayal At Calth box set, but were dismayed to find them sold out and “no longer available” (as opposed to “Temporarily out of stock”), then fear not!

Forge World have your back, according to a reply I got when I asked about the missing kits:


Not only will the weapons be coming back in an improved form, but other kits will be getting a refresh, along with new accessory packs.

I’ve been using iCloud Photo Library (iCPL) for the last few months, basically since the day it went to Public Beta. It was one of the features I was most excited about for iOS 8 and OS X Yosemite. The idea is fantastic – all your photos available on all your (Apple) devices, and it’s integrated with what is probably your most frequently used camera, so new photos are automatically added.

When it works, it’s seamless and brilliant, and I can’t say enough good things about it… but this morning I turned it off on my iPhone and won’t be switching it back on any time soon.


Here are the two major problems I’ve had with it:

1. It causes (most) apps accessing the photo library to run extremely slowly

Anytime I open an app which wants to access the photo library, that app tends to hang for a few seconds. This is easiest to see in something like Instagram, where if you go to add a picture, the icon in the bottom left which lets you select an existing image will show as blank for several seconds while it loads the first thumbnails. I’ve seen similar behaviour in the stock Camera app, and numerous image editors.

2. It absolutely destroys my mobile data allowance

I have a 4GB data allowance on my 4G data plan. When I have iCloud Photo Library enabled on my iPhone – even after syncing the entire library over WiFi before leaving the house – within a couple of days I will get a text message from my network telling me I’ve only got 200MB of my allowance left. This happens even after disallowing the Photos app from using mobile data, so it’s obviously some other process running in the background. To be clear: with iCloud Photo Library turned off, I have never been close enough to my data cap to trigger a warning; with it turned on, I use up my entire allowance within a few days.

This morning, in the space of just 2 hours

The first problem of slowness has improved with the iOS 9 public betas, but #2 is still happening. A lot. It’s probably cost me upwards of £60 in increased mobile phone bills over the last few months. And this is before we get to other issues, including: either iCPL or the new Photos app screwing up the metadata on a whole bunch of photos1; occasional sync conflicts2; problems caused by turning it off because of the other issues3.

By and large, I get the impression I’m the outlier. For most people, iCloud Photo Library works without issue and they’re happy with it. Hopefully it’s the same for you! But for me it just doesn’t work reliably enough without some serious downsides.

What’s your experience of iCloud Photo Library been like? Let me know!


  1. I found this one out when I tried importing my library into Google Photos and Dropbox for redundancy. Roughly 2500 photos no longer have any date information associated with them, so both services sort them into the day they were uploaded, completely ruining any logical grouping. 
  2. In iOS 8, if you quickly edited a new picture on your device while it synced to your other devices, one of two things would happen: either only the edit would sync, or your edit would be discarded when the sync finished.
  3. What should happen is your iCPL photos are removed from the device, apart from the pictures that were in the device’s Camera Roll beforehand. Except it usually turns into a crap-shoot as to which photos are kept. And sometimes, despite removing all these photos, the storage space isn’t freed up afterwards. Which is awesome when you only have a 16GB device.


Never, ever pre-order games. That’s the general rule, especially given such recent debacles as the PC version of Arkham Knight.

But as with every rule, there are exceptions. The final entry in the Starcraft 2 series is – for me – one of them. Within 20 minutes of learning the pre-order for the digital editions was live, I had the deluxe edition ordered.

Starcraft is one of those rare gaming series I hold dear; I played the original on my very first PC, fell in love with the story, then had to wait over a decade until Starcraft 2 came along. When the first part, Wings of Liberty, arrived 12 years after Starcraft 1, I fell in love all over again, and any fears about how splitting the game into 3 parts would work out were put to rest. The 3 year wait for Heart of the Swarm was agonising — I loved the story being told so much I just wanted more!

Now we’re nearing the end, at last. The final chapter in the Starcraft saga will arrive sometime between now and March, and I can’t wait.

I have none, because we haven’t seen enough full information – in context – to make any informed opinions.

And neither have you. I get it, change is scary. But stop whining on the internet about AoS before you have all the information. Please? It’ll make the transition much more pleasant for you, me, and everyone else.

I’m flabbergasted by how quickly it all went from “ok, this looks like it could be fun and interesting,” to “ZOMG! The sky is falling! F-you GW! This is the most ridiculous and crappy game EEEEHHHVVAR!”

And it hasn’t even been officially revealed yet. Careful; your knee is jerking so hard you might do yourself an injury.

I do have one final, parting thought to leave you with:

If you want a balanced, tournament-friendly (and 1st-party supported!) Fantasy massed-battle game that plays like a “Warhammer 9th” – basically what everyone complaining the loudest seems to be lamenting Age of Sigmar is not – then I humbly suggest you go check out Kings of War. 2nd Edition is right around the corner, with the beta rules available for free download. A number of Warhammer Fantasy armies port over to KoW with little-to-no modification or need to buy new models. It’s fast, deceptively simple, fun, well written, and actively supported. If you’re up in arms about AoS, it wouldn’t hurt to check it out.

I mentioned a few weeks back I was considering my choices for how to upgrade my aging computer equipment, and of the choices, building my own custom PC would be the most rewarding path to take. I swithered a bit on whether I really wanted to do this, but in the end I gave in to the temptation to build something entirely my own.

Great, I know what I want to do, now how do I get there? It’s been several years since I built a PC1, and I haven’t been keeping up with the trends, or what’s the latest and greatest in terms of performance, price, or anything really.

I had a few ideas of what I wanted – it needed to be small, as space in the office is at a premium. It needed to be as powerful as I could afford, so it would last a decent amount of time until it needed major upgrades, while being flexible enough to tackle many different types of task – development, gaming, photo (and potentially basic video/audio) editing, for example. In a perfect world, I wanted it to be as quiet as possible and look good.

The last few weeks have been spent doing research, going back and forward over potential configurations using PC Part Picker before settling on an outline of what I wanted. I took it over to /r/BuildAPC for a sense check, and was told my best bet was to change the graphics card for something more powerful than I had picked out. I rejigged a few things to make that possible, and ended up with the spec below:

CPU: Intel Core i5-4690K 3.5GHz Quad-Core Processor
CPU Cooler: Cooler Master Nepton 120XL 76.0 CFM Liquid CPU Cooler
Motherboard: Asus MAXIMUS VII IMPACT Mini ITX LGA1150 Motherboard
Memory: Corsair Vengeance Pro 8GB (2 x 4GB) DDR3-1866 Memory
Storage: Samsung 850 EVO-Series 250GB 2.5″ Solid State Drive
Video Card: Gigabyte GeForce GTX 960 2GB Video Card
Case: Silverstone FT03B-MINI (Black) Mini ITX Tower Case
Power Supply: Silverstone 500W 80+ Gold Certified Fully-Modular SFX Power Supply

The graphics card might still be swapped for another, similarly specced one, but otherwise this is what I’ll be building in a little over a week’s time, when I have some time off. I’ll be talking more about the build closer to the time, as I have a few things planned which will make it a bit more interesting than just a straight PC build.


  1. It was in 2008. I checked my order history. 

Over the last couple of weeks, my iPhone 5S has been rebooting itself during the night. Once (last Saturday) it got stuck in a reboot loop on the Apple logo screen. Strangely, it seemed to be emitting some kind of tone every time it restarted… maybe that was my woken-at-3am brain imagining things, but I’m sure it also made a noise in the early hours of this morning when it rebooted.

The most annoying thing about this is that it’s only happening at night, while I’m asleep. I know it’s happening because my lock screen tells me so, and I can’t use Touch ID to unlock the phone. That, and the fact the display flashing up the stark white loading screen sometimes wakes me up. Throughout the day, everything appears fine. It’s really quite bizarre.

I’d reset the phone to factory settings, but there are a couple of security-related apps installed which would be a massive PITA to have to de-authorise and set up again.

Has anyone else experienced this?

Earlier on I was trying to find a way to “downgrade” a Google Apps account to a personal account. Well, I found a way. Kinda. Ok, not really – I slipped up and deleted my Google account.

I was a bit naive about what removing a Google Apps subscription entailed. In the absence of any clear documentation, I assumed (hoped, really) it would remove the baggage of Google Apps, leaving me with a normal Google personal account (especially as the account predated Apps). It didn’t actually remove Google Apps… but it did remove my access to pretty much every useful Google service. I was locked out of Drive/Docs, Browser Sync… everything I use on a regular basis.

It turns out that if you want to delete Google Apps, cancelling your subscription is only a partial measure. Whereas in most services “cancel subscription” means “I’m done, so remove all my stuff and let me go”, if you want to cancel Apps then you have to cancel, and then do the non-obvious step of explicitly deleting your domain from the service.

At this point, my choice was: buy a new subscription to Apps, putting me back to square one – only now paying for it – or completely delete everything to do with the Apps account. So deletion it was.

Eventually I tracked down where in the mess that is the Apps admin area I could find the delete domain button, held my breath, and clicked.

Milliseconds later I was dumped out of Google Apps, and everything was gone. Everything. Even the stuff you’d forgotten about, like your Google+ profile, OAuth logins to other sites, logins on other devices, and accounts you forgot were merged, like my YouTube account and subscriptions. My iPhone complained, WordPress complained, Feedly complained, Chrome complained, and so did many, many more! Years of settings, data, and integrations, gone in a button click.

Immediately I had a wave of regret, but also a slight sense of a weight being lifted. I no longer had to worry about the schizophrenic nature of my old account. If I wanted to try a new Google service, I didn’t have to wait for it to be Apps-enabled. Yes, a whole bunch of data was gone, but in a way, that was good. I would be starting over from scratch, without all the cruft that had accumulated over the many years.

So I guess it’s not that bad, really. Just a little inconvenient in the short-term. I’ve created a new account, relinked any complaining devices, and generally started rebuilding.

But please, Google, make the whole Apps/Account integration more user-friendly!

I like to think of myself as generally a smart person. I have my weaknesses, but I’m usually pretty good at figuring something out – particularly if it’s tech related. Problem solving is generally one of my strong points.

So why, oh why, can I not figure out how to “downgrade” or migrate a Google Apps account to a “normal” Google account?

For background, I have a legacy Google Apps account, from when I used to run my own-domain email account through the service. I switched to Fastmail a couple of years ago, but by this point the Apps account was my “main” Google account – the one I was logged into all the time and thus had my data attached to.

I wanted to get rid of the Apps part of the account, as it causes some weird issues now and again, doesn’t work with all Google services, and I don’t use it for the intended purpose any more.

But it’s increasingly looking like this might not be possible. I can think of a number of enterprise-y reasons why not, but I can also think of a few use cases where it should be possible to at least allow it. I’ll keep hunting for now.

Note: I found this mini How-To while having a clean-up of my GitHub repositories. I figured it would be worth sharing on my blog. Hopefully it is of use to someone. Warning: bad ASCII art ahead!


The Problem

  1. I have my repository hosted on GitHub
  2. I have an internal Git server used for deployments
  3. I want to keep these synchronised using my normal workflow

Getting Started

Both methods I’ll describe need a “bare” version of the GitHub repository on your internal server. This worked best for me:

cd ~/projects/repo-sync-test/
scp -r .git user@internalserver:/path/to/sync.git

Here, I’m changing to my local working directory, then using scp to copy the .git folder to the internal server over ssh.

More information and examples of this can be found in the online Git Book:

4.2 Git on the Server – Getting Git on a Server
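As an aside, `git clone --bare` can build the same server-side layout for you, instead of copying the .git folder by hand. Here’s a minimal sketch using throwaway temp paths (standing in for the real working copy and server path above):

```shell
# Alternative sketch: let `git clone --bare` build the bare copy.
# All paths here are throwaway stand-ins for the real ones.
src=$(mktemp -d)/repo-sync-test
git init -q "$src"
git -C "$src" -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "initial commit"
git clone -q --bare "$src" "$src.git"
# then, as before: scp -r "$src.git" user@internalserver:/path/to/sync.git
```

The result is the same bare repository layout the scp approach produces.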

Once the internal server version of the repository is ready, we can begin!

The Easy, Safe, But Manual Method:

+---------+ +----------+ /------>
| GitHub  | | internal | -- deploy -->
+---------+ +----------+ \------>
^                     ^
|                     |
|     +---------+     |
\-----|   ME!   | ----/
      +---------+

This one I have used before, and it is the least complex. It needs the least setup, but doesn’t sync the two repositories automatically. Essentially, we are going to add a second Git Remote to the local copy, and push to both servers in our workflow:

In your own local copy of the repository, checked out from GitHub, add a new remote a bit like this:

git remote add internal user@internalserver:/path/to/sync.git

This guide on help.github.com has a bit more information about adding Remotes.

You can change the remote name of “internal” to whatever you want. You could also rename the remote which points to GitHub (“origin”) to something else, so it’s clearer where it is pushing to:

git remote rename origin github

With your remotes ready, to keep the servers in sync you push to both of them, one after the other:

git push github master
git push internal master
  • Pros: Really simple
  • Cons: It’s a little more typing when pushing changes
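As an optional refinement (not part of the method above; the remote name and URLs are placeholders), git lets a single remote carry several push URLs, so one `git push` can hit both servers. A sketch, demonstrated in a throwaway repository:

```shell
# Demo in a throwaway repo: attach two push URLs to one remote.
tmp=$(mktemp -d)
git init -q "$tmp/repo" && cd "$tmp/repo"
git remote add both https://github.com/user/repo.git
# the first --add keeps GitHub as a push target; the second adds the mirror
git remote set-url --add --push both https://github.com/user/repo.git
git remote set-url --add --push both user@internalserver:/path/to/sync.git
git remote -v    # one fetch URL, two push URLs
```

After this, `git push both master` updates both servers in one command.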

The Automated Way:

+---------+            +----------+ /------>
| GitHub  |   ======>  | internal | -- deploy -->
+---------+            +----------+ \------>
^
|
|              +---------+
\------------- |   ME!   |
               +---------+

The previous method is simple and reliable, but it doesn’t really scale that well. Wouldn’t it be nice if the internal server did the extra work?

The main thing to be aware of with this method is that you wouldn’t be able to push directly to your internal server – if you did, then the changes would be overwritten by the process I’ll describe.

Anyway:

One problem I had in setting this up initially is that the local repositories on my PC are cloned from GitHub over SSH, which would require a lot more setup to allow the server to fetch from GitHub without any interaction. So what I did was remove the existing remote, and add a new one pointing to the HTTPS link:

(on the internal server)
cd /path/to/repository.git
git remote rm origin
git remote add origin https://github.com/chrismcabz/repo-syncing-test.git
git fetch origin

You might not have to do this, but I did, so best to mention it!

At this point, you can test everything is working OK. Create or modify a file in your local copy, and push it to GitHub. On your internal server, do a git fetch origin to sync the change down to the server repository. Now, if you were to try a normal git merge origin at this point, it would fail, because we’re in a “bare” repository. If we were to clone the server repository to another machine now, it would still reflect the previous commit.

Instead, to see our changes reflected, we can use git reset (I’ve included example output messages):

git reset refs/remotes/origin/master

Unstaged changes after reset:
M LICENSE
M README.md
M testfile1.txt
M testfile2.txt
M testfile3.txt

Now if we were to clone the internal server’s repository, it would be fully up to date with the repository on GitHub. Great! But so far it’s still a manual process, so let’s add a cron task to remove the need for human intervention.

In my case, adding a new file to /etc/cron.d/, with the contents below was enough:

*/30 * * * * user cd /path/to/sync.git && git fetch origin && git reset refs/remotes/origin/master > /dev/null

What this does is tell cron that every 30 minutes it should run our command as the user “user”. Stepping through the command, we’re asking to:

  1. cd to our repository
  2. git fetch from GitHub
  3. git reset like we did in our test above, while sending the messages to /dev/null

That should be all we need to do! Our internal server will keep itself up-to-date with our GitHub repository automatically.

  • Pros: It’s automated; only need to push changes to one server.
  • Cons: If someone mistakenly pushes to the internal server, their changes will be overwritten


I’m in the market for a new computer1, but I have no idea what way to go. I’ve been making do with older kit for the last few years, but all of it is pretty much at the end of its usable life.

I recently set up a new “office” area in the house, and the way I did it allows me to swap between my work-supplied laptop, and a computer of my own, just by plugging into the right monitor input and swapping a USB cable. This setup also allows my son to make use of the desk if he needs to.

Until recently, the computer I used most around the house was a 9 year old Dell Latitude laptop which I had made usable by putting an SSD into it, and building a lightweight Arch Linux installation. This was primarily because a laptop was all I had space for. Actually, I tell a lie – the “computer” I use most is my iPhone, but for times the iPhone can’t cut it (for whatever reason) I used the Dell2. While this arrangement worked, it showed its age, and it was fiddly at times.

I’ve had a 6 year old Mac Mini lying around for a while, doing nothing. It’s only barely more powerful than the Dell3, and the one time I had it plugged into the living room TV, it was just plain awkward to use. With the new office I was able to plug it in to a proper monitor/keyboard/mouse arrangement which made it more viable. So this past weekend I took the SSD from the Dell, put it in the Mac, and made that my “home computer.” It’s just fast enough to not induce rage when trying to do anything more taxing than surf the web and other light duties.

Now I’ve got a “proper” desk and space, I’ve been thinking I should look at getting something which will last me another few years. The cheapest upgrade I could do is to spend ~£60 and double the RAM in the Mac Mini, going from 4GB to 8GB. I’m sure that will give a noticeable boost to OS X, but it doesn’t really change the fact the system is on borrowed time. It could buy me another 6-12 months, but at some point, likely soon, something is going to fail. The way I see it, my choices are:

  1. Buy a newer Mac, probably a laptop for flexibility (plus that’s where all their non-iOS/Watch innovation seems to be going).
  2. Buy a Windows laptop.
  3. Build a custom PC.

Of the choices, #3 is likely the most satisfying, and would have the most upgrade potential further down the line, though I would be constrained later by choices I made now. It also has the potential to get very expensive; I priced up a high-end Mini-ITX system for a bit of fun, and it came to roughly £1000 before choosing a suitable graphics card. I could definitely price something for less, and would probably have to, but it would have to be weighed against longevity of usable performance and upgradability. I am a little space constrained, so a massive tower is never going to be practical, but there are plenty of options between Mini-ITX and mATX nowadays.

A Windows laptop feels like it would be a cop-out, and there’s not much out there I feel inspired enough to part with my money for. There’s a couple of nice laptops I’ve seen4, but none I feel would last as long as I’d like them to.

Getting a new Mac has been the direction I’ve been leaning towards for a while, but I’ve always struggled to justify it vs. other spending priorities. Plus, when you factor in how fast Apple iterate their hardware and the lack of after-sale upgradability, you’re always hoping to “time it right”. That said, as an iPhone/iPad owner there’s a lot of upside to getting a Mac, for example: close integration through Handoff/Continuity (granted, which I can’t currently use with the Mini), and iCloud Photo Library. I guess I could set up something more “cross-platform” for the photo library, using Dropbox, but I found Apple’s solution to be that little bit nicer to work with.

So the gist of this much-longer-than-I-planned stream of consciousness is that I need to start thinking about replacing the old and almost busted computer kit I have with something new. I don’t know what that will be yet, and I’d hoped getting my thoughts out would help me focus my mind on what I want to do.

No such luck though. Any ideas?


  1. Anyone who knows me probably knows I’ve actually been talking about it for ~4 years. 
  2. And what of my iPad? I mainly just use it for Hearthstone and Games Workshop rulebooks. Since iOS 8 (I think), my iPad has taken a huge hit in performance, and just isn’t as capable as it once felt. 
  3. On paper, at least. In practice it was severely hamstrung by the old-school HDD and running OS X. 
  4. My work laptop is quite nice; it’s a Dell Ultrabook, thin, light, and performant enough. But the consumer pricing is higher than I’d value it at. 

With all the cool new stuff constantly being released recently, it can be very easy to end up with a large hobby backlog. When this happens it’s possible to get overwhelmed by your “to do list,” and it starts to become a mental drag; when this kicks in, your hobby no longer feels fun and instead feels like working a job you hate. Sometimes it’s just best to declare something a lost cause and start over afresh.

I went through this very recently. My backlog had grown too big for me to see the end of it – especially with the glacial pace I paint at! When I took stock of what was in the queue I had 2 full armies: a jump-heavy Flesh Tearers list, and a mechanised Tempestus Scions list. Not counting fun stuff like vehicles and characters, I had well over 100 models to prepare, assemble and paint… and these are just the army projects! Throw in various starter boxes for other games, and other sundry small projects, and the list was nearer 400.

Too. Damn. Many.

What to do? My initial plan was to freeze buying anything new until I’d whittled the backlog down to a more manageable level. Such a sensible plan might work for many a struggling hobbyist, but unfortunately, it was not the right plan for me. Despite several months of not buying any new figures1, I made zero impact on the pile of miniatures I had to work through. On top of that, I found myself losing all inspiration for certain projects. Some of that came down to gnawing insecurities about being able to achieve the vision I had in my head, others from indecision about what that vision even was any more. In the end there was just a pile of boxes and sprues causing me to feel terrible every time I thought about it. This was no longer a hobby, it was a chore. Something had to give, and it would be great if it wasn’t me.

In the tech world, there’s a popular approach to email management called Inbox Zero. The idea is to have your email inbox as empty as possible, so the amount of time your brain is occupied by email is as close to zero as possible. The intention is to reduce the distraction and stress caused by an overwhelmingly full inbox. Related to Inbox Zero, is Email Bankruptcy – the practice of deleting all email older than a certain date (often that day) due to being completely overwhelmed.

One day I realised I needed to declare something similar – Hobby Bankruptcy – or I was going to drive myself out of a hobby I’ve loved for over 20 years.

https://twitter.com/atChrisMc/status/585733864183767042

How was I going to do this? Throwing out hundreds2 of pounds of miniatures would be insane, especially if I changed my mind about something. Selling would take too long, and was subject to the fickleness of others. The simplest (non-destructive) solution won out: I took everything 40K/WHFB related, and stashed it in the loft. Out of sight; out of mind. Literally. The only survivors of the “purge” were source books and the limited edition 25th anniversary Crimson Fists miniature.

https://twitter.com/atChrisMc/status/585836777702875136

https://twitter.com/atChrisMc/status/585841570697609217

I can’t express just how much of a weight off doing this has been. I’m no longer under (self-imposed) pressure to work through a massive backlog I no longer have the enthusiasm for, and yet, if I rediscover that enthusiasm, I can pull individual kits from the loft to work on as and when I want.

In the meantime though, I am free to start work on new projects3.

And yes, I do know I’m crazy.


  1. And growing increasingly anxious about not getting the cool new shinies. 
  2. OK, maybe it’s higher… 
  3. Obviously, any new projects will have much more strict rules around the number of models allowed in the queue at once. No more buying entire armies in one go! 

Thanks to using 2 cork stoppers to elevate the back of the laptop by about an inch.

Laptop cork legs

Typing on this thing (a Toshiba R500) has been abysmal for the 4 years I’ve had this laptop. The keys are slidey, mushy, inconsistent, and generally just a mess of bad design and ergonomics. Tilting the laptop at least makes it more comfortable. Thankfully it’s being replaced soon, but boy do I wish I’d thought of this a lot sooner!

I’ve had my GMail address for several years now; I don’t really use it for anything more than legacy accounts, logins, or as a spam trap. For the most part it just sits there in the background, silently passing on any messages it receives to my “proper” account, which is email with a custom domain hosted on Fastmail.

Over the last 12-18 months, I’ve been receiving a slow-but-steady stream of mail clearly meant for someone else: newsletters mostly, but occasionally something personal, and the odd booking confirmation. At first I put these down to someone mistyping an email address now and then, or to how GMail has fun with dots (“.”) in email addresses1. Whatever the cause, I would just delete them as soon as I realised they weren’t intended for me.

Over time though, it became apparent someone genuinely thought my GMail address was theirs. The nature of the emails became more personal, an increasing variety of individuals and organisations were mailing the address, and increasingly with information you wouldn’t want to miss. I’m guessing from the nature of the mail that they are older, but that’s just a guess. The profile I’ve built up is as follows (I’ve made some details vaguer than I know them to be, and excluded others):

  • They live in an area of North London
  • They are a member of a residents committee
  • They have an elderly/sick family member or friend they wanted to keep up to date on
  • They used to use Eurostar semi-regularly
  • They recently decided to get their garage converted

Where before I used to just delete immediately, I have now taken to responding to certain mails to let senders know they have a wrong address – in the hope they can let the intended recipient know they’re giving out the wrong address. Beyond this, I don’t know what to do… it’s not like I can email them to say!


  1. If you didn’t know, you can place a dot anywhere in a GMail address, and it will still resolve to your address. Another tip: you can “extend” your email address with a plus (“+”) and anything you like, which gives you potentially unlimited addresses for the price of one. For example, test+something@gmail.com will resolve to test@gmail.com. I use this for potential “throwaway” addresses. 
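The dot and plus behaviour described in the footnote can be sketched in code. Below is a minimal Python sketch of how such addresses collapse to one canonical mailbox – it assumes only the two behaviours described above (dots ignored, anything after a “+” discarded), and isn’t an official Google algorithm:

```python
def normalize_gmail(address: str) -> str:
    """Collapse a GMail address to its canonical form.

    Assumes the two documented behaviours: dots in the local
    part are ignored, and anything from a '+' onwards is a
    disposable suffix. Non-GMail addresses are left alone
    (other than lower-casing).
    """
    local, _, domain = address.lower().partition("@")
    if domain not in ("gmail.com", "googlemail.com"):
        return address.lower()
    local = local.split("+", 1)[0]   # strip the "+something" suffix
    local = local.replace(".", "")   # dots are ignored by GMail
    return f"{local}@{domain}"
```

So `normalize_gmail("te.st+shopping@gmail.com")` gives `"test@gmail.com"` – which is exactly why a stream of dotted variants of my address all ends up in my inbox.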

Ads and websites which automatically redirect your iPhone to the App Store1 need to stop being a thing.

I’m seeing more and more instances of this user-hostile behaviour happening when I’m following a link on my phone. Usually it’s caused by an ad unit on the page, but now and again, it’s a site publisher who really, really, wants you to install their app.

Here’s the thing: if I wanted your app, I’d likely already have it installed. If I open a link to your website, I expect to (and am happy to) access your content there. Redirecting me to the App Store is a massive inconvenience and interruption; it takes me out of the app I was already using – often after I’ve already started reading your content – and puts me somewhere I wasn’t expecting to be. It breaks my concentration as my brain switches from reading your content to looking at the app download page. Assuming I still want to read your content after being treated like this, I now have to close the App Store, reopen the app I was just in, and hope I can pick up where I left off. The publishers who treat their users in this way seem to think I’ll:

  • Download the app, and wait for it to install
  • Create the usually mandatory account
  • Validate said account by switching to my email
  • Reopen the app, and try to find the content I’d clicked through to read in the first place
  • Read it (at last!)

Err, how about “no”? I was already reading your content. If you want to pimp your app to me, put a button or mention of it at the end of the article.

When this kidnapping of my attention is caused by an ad, I’ll sometimes go back to the site to finish reading, or I’ll go back to where I found the link, and send it to Pocket to read later instead (and without the ads to interrupt me). When it’s the publisher itself, chances are I’ll be annoyed enough I won’t return. You had your chance, and you chose to send me elsewhere instead. Either way, I sure as heck won’t install any app advertised using this method.

So can we please put a stop to this? It’s even worse than interrupting me to beg for an app review.


  1. This probably applies to Android and the Play Store as well, but I’m on an iPhone and so that’s where I have experience of this problem happening. 

I’ve written previously about how the archives of my blog were less full than they should be – that, between domain changes, server/CMS moves, and times when I simply didn’t care, there were potentially hundreds of posts missing from the early years in particular.

Back up your crap, people – including your blog.

For the last couple of years I’ve had an on-off project to restore as much of this personal history as possible. Every so often I’d go ferreting through old hard disks, or exploring the Internet Archive’s Wayback Machine for old content I could salvage. At first I had limited success, turning up only a handful of posts. Of those, I was fussy and only restored the “worthwhile” posts – usually longer posts about big events, or technical in nature.

This last weekend though, I revised my stance on this. If I was going to recreate my blogging history, I couldn’t – shouldn’t – just cherry-pick. I should include as much as I could possibly recover: the good, the bad, the plain inane. Anything less would feel a bit dishonest, and undermine the raison d’être of the whole endeavour: saving the past.

The only exception would be posts which were so incomplete due to missing assets (mainly images) that the remaining body text made no sense, or posts which were completely unintelligible out of the context of the original blog – entries about downtime, for example. Also excluded was my personal pet peeve – posts “apologising” for the time between updates1!

A Brief Synopsis of the “How”:

To bring the past kicking and screaming into the present, I dove back into the Wayback Machine, going as far back on my first domain as the archive allowed. From there I worked as methodically as I could: from the furthest back onwards, post-by-post. The basic process was:

  • Copy the post text and title to the WordPress new post screen
  • Adjust the post date to match the original
  • Where possible, match the original publishing time. Where this wasn’t available, approximate based on context (mentions of morning/afternoon/evening, number of other posts that day, etc)
  • Check any links in the post (see below)
  • Add any recovered assets – which was rare
  • Turn off WordPress social sharing
  • Publish

I started on the Friday afternoon, and manually “imported” around 50 posts in the first batch.

Turning off social sharing was done so I didn’t flood my Twitter followers with a whole load of links to the posts – some over a decade old. One thing I didn’t anticipate though, and which I had zero control over, was WordPress emailing the old posts to those who had subscribed to email notifications. It wasn’t until a friend IM’d me about her full inbox that I realised what was happening – so if you found your mail filled with notifications as a result of this exercise, I apologise!

To get around this, I ended up creating a new, private WordPress blog to perform the initial manual process, so I could later export a file to import into this blog.

Between Saturday, Sunday, and Monday evenings, I tracked down and copied over a further 125 or so posts. Due to the vagaries of the Wayback Machine, not every post could be recovered. Generally speaking, it was reliable in having a copy of the first page of an archive section, but no further pages. Sometimes I could access “permalink” pages for the other posts, but this was really hit-or-miss. A lot of the time the page the WBM had “saved” was a 404 page from one of my many blog reorganisations over the years, or in other cases, it would have maybe one post out of eight.

I made a rule not to change the original posts in any way – no fixing of typos, or correcting something I was wrong about. The only thing I would do was mark where there was a missing asset with an “Editor’s Note” of some sort, when appropriate. The only content I did have to consider changing was links.

Dealing with Links

One thing I had to consider was what to do about links which might have changed or disappeared over time. When copying from the WBM, links had already been rewritten to point to a (potentially non-existent) WBM archive page, but if the original still existed, I wanted to point to that instead. In the end I had to check pretty much every link by hand – if the original existed, I would point to that page; if not, I would take a chance with the Wayback Machine. In some cases I had to consider what to do where the page existed, but had a different content or purpose to the original. I dealt with these on a case-by-case basis.

For internal links, I pointed to an imported version, if it existed, or removed it if there was none and context allowed.
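Part of this link-checking drudgery could be semi-automated. Here’s a rough Python sketch of the two mechanical steps: pulling the original URL back out of a Wayback Machine capture URL, and checking whether that original still responds. It assumes the common capture-URL shape `web.archive.org/web/<timestamp>/<original-url>`; deciding whether a live page still has the *right* content would, of course, remain a manual job:

```python
import re
import urllib.request

# Wayback capture URLs look like:
#   https://web.archive.org/web/20060101000000/http://example.com/page
# (the timestamp sometimes carries a suffix such as "if_")
WAYBACK_RE = re.compile(
    r"^https?://web\.archive\.org/web/\d{4,14}[a-z_]*/(?P<orig>https?://.+)$"
)

def original_url(wayback_url: str) -> "str | None":
    """Extract the original URL from a Wayback Machine capture URL."""
    m = WAYBACK_RE.match(wayback_url)
    return m.group("orig") if m else None

def still_alive(url: str, timeout: float = 10.0) -> bool:
    """Check whether the original page still responds (status < 400)."""
    try:
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status < 400
    except Exception:
        return False
```

For each rewritten link in a recovered post: if `original_url()` yields something and `still_alive()` says it responds, point at the original; otherwise, keep the WBM link and hope.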

Wrapping Up

In total, I imported around 175 previously “lost” blog entries, covering 2002-2006, with the majority from 2005. These years have gone from having a handful of entries, to having dozens. Overall, this has grown the archives by roughly 50% – a not-insubstantial amount!

At some point I will go back and appropriately tag them all, but that’s a lower priority job for another time.

2007-2010 were years when my writing output dropped a lot, so while I will look for missing entries from this period, I don’t expect to find many at all.

Side Note: History Repeats

I discovered, in the process of doing all this, that I had gone through the same exercise before, roughly 10 years ago!

Over the last few days, I’ve been working on the archives of my old site; cleaning and recategorising them. Today, I have added them to the archives of Pixel Meadow.

These additions represent everything that was left of ChrisMcLeod.Net. Over the course of its life many changes occurred and data was lost – so these additions don’t represent everything that I’ve written there over the years.

You would think I might have learned from this mistake back then, but obviously not! Fingers crossed it’s finally sunk in.


  1. Though only where they had no other content to the post. 

The Reading List is a round-up of interesting blog posts and articles I’ve recently read, curated and posted every couple of days.
