Amazing to think I’ve won £250,000 without even applying. How did they even find me? #miracle #spamfolder
It’s quite possible I have a “problem”… #PepsiMaxAddict ?
A celebratory Wispa Gold, to kick off my vacation #ChocolateBarOfChampions
I’ve been using iCloud Photo Library (iCPL) for the last few months, basically since the day it went to Public Beta. It was one of the features I was most excited about for iOS 8 and OS X Yosemite. The idea is fantastic – all your photos available on all your (Apple) devices, and it’s integrated with what is probably your most frequently used camera, so new photos are automatically added.
When it works, it’s seamless and brilliant, and I can’t say enough good things about it… but this morning I turned it off on my iPhone and won’t be switching it back on any time soon.
Here are the two major problems I’ve had with it:
1. It causes (most) apps accessing the photo library to run extremely slow
Anytime I open an app which wants to access the photo library, that app tends to hang for a few seconds. This is easiest to see in something like Instagram, where if you go to add a picture, the icon in the bottom left which lets you select an existing image will show as blank for several seconds while it loads the first thumbnails. I’ve seen similar behaviour in the stock Camera app, and numerous image editors.
2. It absolutely destroys my mobile data allowance
I have a 4GB data allowance on my 4G data plan. When I have iCloud Photo Library enabled on my iPhone – even after syncing the entire library over WiFi before leaving the house – within a couple of days I will get a text message from my network telling me I’ve only got 200MB of my allowance left. This happens even after disallowing the Photos app from using mobile data, so it’s obviously some other process running in the background. To be clear: with iCloud Photo Library turned off, I have never been close enough to my data cap to trigger a warning; with it turned on, I use up my entire allowance within a few days.
The first problem of slowness has improved with the iOS 9 public betas, but #2 is still happening. A lot. It’s probably cost me upwards of £60 in increased mobile phone bills over the last few months. And this is before we get to other issues, including: iCPL/the new Photos app screwing up the metadata on a whole bunch of photos1; occasional sync conflicts2; and problems caused by turning it off because of the other issues3.
By and large, I get the impression I’m the outlier. For most people, iCloud Photo Library works without issue and they’re happy with it. Hopefully it’s the same for you! But for me it just doesn’t work reliably enough without some serious downsides.
What’s your experience of iCloud Photo Library been like? Let me know!
- I found this one out when I tried importing my library into Google Photos and Dropbox for redundancy. Roughly 2500 photos no longer have any date information associated with them, so both services sort them into the day they were uploaded, completely ruining any logical grouping. ↩
- In iOS 8, if you quickly edited a new picture on your device while it synced to your other devices, one of two things would happen: either only the edit would sync, or your edit would be discarded when the sync finished. ↩
- What should happen is your iCPL photos are removed from the device, apart from the pictures that were in the Camera Roll previously. Except it usually turns into a crap-shoot as to which photos are kept. And sometimes, despite removing all these photos, the storage space isn’t freed up afterwards. Which is awesome when you only have a 16GB device. ↩
Never, ever pre-order games. That’s the general rule, especially given such recent debacles as the PC version of Arkham Knight.
But as with every rule, there are exceptions. The final entry in the Starcraft 2 series is – for me – one of them. Within 20 minutes of learning the pre-order for the digital editions was live, I had the deluxe edition ordered.
Starcraft is one of those rare gaming series I hold dear; I played the original on my very first PC, fell in love with the story, then had to wait over a decade until Starcraft 2 came along. When the first part, Wings of Liberty, arrived 12 years after Starcraft 1, I fell in love all over again, and all fears about how splitting the game into three parts would work out melted away. The 3 year wait for Heart of the Swarm was agonising — I loved the story being told so much I just wanted more!
Now we’re nearing the end, at last. The final chapter in the Starcraft saga will arrive sometime between now and March, and I can’t wait.
I have none, because we haven’t seen enough full information – in context – to make any informed opinions.
And neither have you. I get it, change is scary. But stop whining on the internet about AoS before you have all the information. Please? It’ll make the transition much more pleasant for you, me, and everyone else.
I’m flabbergasted by how quickly it all went from “ok, this looks like it could be fun and interesting,” to “ZOMG! The sky is falling! F-you GW! This is the most ridiculous and crappy game EEEEHHHVVAR!”
And it hasn’t even been officially revealed yet. Careful; your knee is jerking so hard you might do yourself an injury.
I do have one final, parting thought to leave you with:
If you want a balanced, tournament-friendly (and 1st-party supported!) Fantasy massed-battle game that plays like a “Warhammer 9th” – basically what everyone complaining the loudest seems to be lamenting Age of Sigmar is not – then I humbly suggest you go check out Kings of War. 2nd Edition is right around the corner, with the beta rules available for free download. A number of Warhammer Fantasy armies port over to KoW with little-to-no modification or need to buy new models. It’s fast, deceptively simple, fun, well written, and actively supported. If you’re up in arms about AoS, it wouldn’t hurt to check it out.
In which I play the final boss fight of Hearthstone’s Black Rock Mountain expansion. Badly.
[This is my first time trying this game streaming/recording malarkey, so a technical error means there’s no commentary track on this video]
I mentioned a few weeks back I was considering my choices for how to upgrade my aging computer equipment, and of the choices, building my own custom PC would be the most rewarding path to take. I swithered a bit on whether I really wanted to do this, but in the end I gave in to the temptation to build something entirely my own.
Great: I know what I want to do. Now, how do I get there? It’s been several years since I built a PC1, and I haven’t been keeping up with the trends, or what’s the latest and greatest in terms of performance, price, or anything really.
I had a few ideas of what I wanted – it needed to be small, as space in the office is at a premium. It needed to be as powerful as I could afford, so it would last a decent amount of time until it needed major upgrades, while being flexible enough to tackle many different types of task – development, gaming, photo (and potentially basic video/audio) editing, for example. In a perfect world, I wanted it to be as quiet as possible and look good.
The last few weeks have been spent doing research, going back and forward over potential configurations using PC Part Picker before settling on an outline of what I wanted. I took it over to /r/BuildAPC for a sense check, and was told my best bet was to change the graphics card for something more powerful than I had picked out. I rejigged a few things to make that possible, and ended up with the spec below:
The graphics card might still be swapped for another, similarly specced one, but otherwise this is what I’ll be building in a little over a week’s time, when I have some time off. I’ll be talking more about the build closer to the time, as I have a few things planned which will make it a bit more interesting than just a straight PC build.
- It was in 2008. I checked my order history. ↩
Over the last couple of weeks, my iPhone 5S has been rebooting itself during the night. Once (last Saturday) it got stuck in a reboot loop on the Apple logo screen. Strangely, it seemed to be emitting some kind of tone every time it restarted… maybe that was my woken-at-3am brain imagining things, but I’m sure it also made a noise in the early hours of this morning when it rebooted.
The most annoying thing about this is that it’s only happening at night, while I’m asleep. I know it’s happening because my lock screen tells me so, and I can’t use TouchID to unlock the phone. That, and the fact the display flashing up the stark white loading screen sometimes wakes me up. Throughout the day, everything appears fine. It’s really quite bizarre.
I’d reset the phone to factory settings, but there are a couple of security-related apps installed which would be a massive PITA to have to de-authorise and set up again.
Has anyone else experienced this?
So after the saga which was getting rid of a legacy Google Apps service on my “main” Google Account, I had to create a new one. I thought it worth sharing the experience of this, to round out the story.
At first I set up the account with my primary email, with no need for a GMail account. This was straightforward enough, and I was able to set up 2-factor authentication, Chrome browser sync, and re-set up some of the mobile applications.
I had to re-verify my account via an SMS code for a number of services, even after setting up 2FA, which is no big deal really, but you’d think once would be enough.
After a couple of hours, I decided I wanted to make use of Google Now. To get the most out of it I would need to use GMail. No big deal – I can set up some forwarding and aliases to integrate this with Fastmail1. Finding a usable address was the hard part. I must have spent a good 30-40 minutes coming up with and trying addresses, only to find they were already taken. An “email@example.com” address would have worked, because I was setting up forwarding, but who actually wants an address like that? Completing the GMail setup brought a message that while my primary email on the account had now changed, I would still be able to login with the email address I had signed up with.
The Google mobile apps have a nice feature where, if you’re logged into one of them, it recognises your details and you can quickly login to the others. These all worked a treat, except the GMail app. The GMail app picked up that I was logged in to other apps with my account fine, but trying to login kept coming back with an error. In the end I had to “login as someone else” and login with the GMail address + password instead of the other email address I was using as my login. A minor annoyance, and one which might catch some less technical people out.
YouTube was normal enough, but wanted me to choose between posting as myself, or creating a new Google+ page to post as. I don’t post to YouTube, and I wanted to keep my account as simple/clean as possible, so I went with myself even though I had a slight unease about it. If I did ever start a “channel” I’d probably want to call it something else, but that’s a problem for another day.
I feel a little better about my account situation now. Overall, Google’s account system is a deep rabbit warren of inconsistent WTF-ery, but once you make headway and get some of the key services set up, you should be more or less set.
- Post still to come ↩
Earlier on I was trying to find a way to “downgrade” a Google Apps account to a personal account. Well, I found a way. Kinda. Ok, not really – I slipped up and deleted my Google account.
I was a bit naive about what removing a Google Apps subscription entailed. In the absence of any clear documentation, I assumed it would remove the baggage of Google Apps, leaving me with a normal Google personal account (especially as the account predated Apps). It didn’t actually remove Google Apps… but it did remove my access to pretty much every useful Google service. I was locked out of Drive/Docs, Browser Sync… everything I use on a regular basis.
It turns out that if you want to delete Google Apps, cancelling your subscription is only a partial measure. Whereas in most services “cancel subscription” means “I’m done, so remove all my stuff and let me go,” if you want to cancel Apps then you have to cancel, and then do the non-obvious step of explicitly deleting your domain from the service.
At this point, my choice was: buy a new subscription to Apps, putting me back to square one – only now paying for it – or completely delete everything to do with the Apps account. So deletion it was.
Eventually I tracked down where in the mess that is the Apps admin area I could find the delete domain button, held my breath, and clicked.
Milliseconds later I was dumped out of Google Apps, and everything was gone. Everything. Even the stuff you’d forgotten about, like your Google+ profile, OAuth logins to other sites, logins on other devices, and accounts you forgot were merged, i.e. my YouTube account and subscriptions. My iPhone complained, WordPress complained, Feedly complained, Chrome complained, and so did many, many more! Years of settings, data, and integrations, gone in a button click.
Immediately I had a wave of regret, but also a slight sense of a weight being lifted. I no longer had to worry about the schizophrenic nature of my old account. If I wanted to try a new Google service, I didn’t have to wait for it to be Apps-enabled. Yes, a whole bunch of data was gone, but in a way, that was good. I would be starting over from scratch, without all the cruft that had accumulated over the many years.
So I guess it’s not that bad, really. Just a little inconvenient in the short-term. I’ve created a new account, relinked any complaining devices, and generally started rebuilding.
But please, Google, make the whole Apps/Account integration more user-friendly!
I like to think of myself as generally a smart person. I have my weaknesses, but I’m usually pretty good at figuring something out – particularly if it’s tech related. Problem solving is generally one of my strong points.
So why, oh why, can I not figure out how to “downgrade” or migrate a Google Apps account to a “normal” Google account?
For background, I have a legacy Google Apps account, from when I used to run my own-domain email account through the service. I switched to Fastmail a couple of years ago, but by this point the Apps account was my “main” Google account – the one I was logged into all the time and thus had my data attached to.
I wanted to get rid of the Apps part of the account, as it causes some weird issues now and again, doesn’t work with all Google services, and I don’t use it for the intended purpose any more.
But it’s increasingly looking like this might not be possible. I can think of a number of enterprise-y reasons why not, but I can also think of a few use cases where it should be possible to at least allow it. I’ll keep hunting for now.
Note: I found this mini How-To while having a clean-up of my GitHub repositories. I figured it would be worth sharing on my blog. Hopefully it is of use to someone. Warning: bad ASCII art ahead!
- I have my repository hosted on GitHub
- I have an internal Git server used for deployments
- I want to keep these synchronised using my normal workflow
Both methods I’ll describe need a “bare” version of the GitHub repository on your internal server. This worked best for me:
cd ~/projects/repo-sync-test/
scp -r .git user@internalserver:/path/to/sync.git
Here, I’m changing to my local working directory, then using scp to copy the .git folder to the internal server over SSH. More information and examples of this can be found in the online Git Book.
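As an aside of my own (not part of the original how-to): if copying the .git folder around by hand feels fragile, git clone --bare produces an equivalent bare repository. A minimal sketch, using throwaway /tmp paths in place of a real server:

```shell
# Throwaway paths so this runs anywhere; clean up any previous run
rm -rf /tmp/bare-demo-src /tmp/bare-demo-sync.git

# A stand-in for the local working copy
git init --quiet /tmp/bare-demo-src

# A bare clone contains only the contents of .git -- no working tree --
# which is exactly what the server-side repository should be
git clone --quiet --bare /tmp/bare-demo-src /tmp/bare-demo-sync.git

# Confirm the result really is a bare repository (prints "true")
git -C /tmp/bare-demo-sync.git rev-parse --is-bare-repository
```

On a real setup you’d run the bare clone on the internal server itself, or create it locally like this and scp the resulting directory across, since git clone can’t write its output to a remote path.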
Once the internal server version of the repository is ready, we can begin!
The Easy, Safe, But Manual Method:
+---------+      +----------+
| GitHub  |      | internal | -- deploy -->
+---------+      +----------+
     ^                ^
     |                |
     \----+---------+-/
          |   ME!   |
          +---------+
This is one I have used before, and it is the least complex. It needs the least setup, but doesn’t sync the two repositories automatically. Essentially we are going to add a second Git Remote to the local copy, and push to both servers in our workflow:
In your own local copy of the repository, checked out from GitHub, add a new remote a bit like this:
git remote add internal user@internalserver:/path/to/sync.git
This guide on help.github.com has a bit more information about adding Remotes.
You can change the remote name of “internal” to whatever you want. You could also rename the remote which points to GitHub (“origin”) to something else, so it’s clearer where it is pushing to:
git remote rename origin github
With your remotes ready, to keep the servers in sync you push to both of them, one after the other:
git push github master
git push internal master
- Pros: Really simple
- Cons: It’s a little more typing when pushing changes
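One refinement worth knowing about (again, my own aside, not from the original post): Git lets a single remote carry several push URLs, so one git push can update both servers. A sketch, with illustrative URLs standing in for the real ones:

```shell
# Throwaway repository so the commands run standalone; URLs are illustrative
rm -rf /tmp/multi-push-demo
git init --quiet /tmp/multi-push-demo
cd /tmp/multi-push-demo

# One remote that fetches from GitHub...
git remote add origin https://github.com/example/repo.git

# ...but pushes to both GitHub and the internal server. Adding any push
# URL overrides the default (fetch) URL for pushes, so GitHub is re-added.
git remote set-url --add --push origin https://github.com/example/repo.git
git remote set-url --add --push origin user@internalserver:/path/to/sync.git

# Both URLs are now listed as (push) destinations
git remote -v
```

With this configured, a single git push origin master would send the branch to both servers, while fetches still come from GitHub only.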
The Automated Way:
+---------+         +----------+
| GitHub  | ======> | internal | -- deploy -->
+---------+         +----------+
     ^
     |
     |              +---------+
     \------------- |   ME!   |
                    +---------+
The previous method is simple and reliable, but it doesn’t really scale that well. Wouldn’t it be nice if the internal server did the extra work?
The main thing to be aware of with this method is that you shouldn’t push directly to your internal server – if you did, those changes would be overwritten by the process I’ll describe.
One problem I had in setting this up initially, is the local repositories on my PC are cloned from GitHub over SSH, which would require a lot more setup to allow the server to fetch from GitHub without any interaction. So what I did was remove the existing remote, and add a new one pointing to the https link:
On the internal server:

cd /path/to/repository.git
git remote rm origin
git remote add origin https://github.com/chrismcabz/repo-syncing-test.git
git fetch origin
You might not have to do this, but I did, so best to mention it!
At this point, you can test everything is working OK. Create or modify a file in your local copy, and push it to GitHub. On your internal server, do a git fetch origin to sync the change down to the server repository. Now, if you were to try and do a normal git merge origin at this point, it would fail, because we’re in a “bare” repository. If we were to clone the server repository to another machine, it would still reflect the previous commit.
Instead, to see our changes reflected, we can use git reset (I’ve included example output messages):

git reset refs/remotes/origin/master
Unstaged changes after reset:
M       LICENSE
M       README.md
M       testfile1.txt
M       testfile2.txt
M       testfile3.txt
Now if we were to clone the internal server’s repository, it would be fully up to date with the repository on GitHub. Great! But so far it’s still a manual process, so let’s add a cron task to remove the need for human intervention.
In my case, adding a new file to /etc/cron.d/ with the contents below was enough:
*/30 * * * * user cd /path/to/sync.git && git fetch origin && git reset refs/remotes/origin/master > /dev/null
What this does is tell cron that every 30 minutes it should run our command as the user “user”. Stepping through the command, we’re asking to:
- cd to our repository
- git fetch from GitHub
- git reset like we did in our test above, while sending the messages to /dev/null
That should be all we need to do! Our internal server will keep itself up-to-date with our GitHub repository automatically.
- Pros: It’s automated; only need to push changes to one server.
- Cons: If someone mistakenly pushes to the internal server, their changes will be overwritten
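One final aside of my own on the git reset step: in a bare repository you can skip the reset entirely by fetching with a mirror-style refspec, which writes the fetched branches directly into refs/heads/*. A self-contained sketch using throwaway /tmp repositories:

```shell
# Throwaway repositories; identity flags are inline so the demo commits
# work on a machine with no git config. Clean up any previous run first.
rm -rf /tmp/mirror-demo-src /tmp/mirror-demo-sync.git
git init --quiet /tmp/mirror-demo-src
cd /tmp/mirror-demo-src
echo "hello" > file.txt
git add file.txt
git -c user.name=demo -c user.email=demo@example.com commit --quiet -m "first"

# A bare "server" copy, as in the article
git clone --quiet --bare /tmp/mirror-demo-src /tmp/mirror-demo-sync.git

# A new commit upstream that the bare copy doesn't have yet
echo "more" >> file.txt
git add file.txt
git -c user.name=demo -c user.email=demo@example.com commit --quiet -m "second"

# Mirror-style fetch: branches are written straight into refs/heads/*,
# no git reset needed afterwards
git -C /tmp/mirror-demo-sync.git fetch --quiet origin '+refs/heads/*:refs/heads/*'

# Both repositories now resolve HEAD to the same commit
git -C /tmp/mirror-demo-sync.git rev-parse HEAD
```

The cron entry could then run just the fetch with this refspec in place of the fetch-plus-reset pair.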
And with only one input controller as well! Both games are setup next to each other, and are taking input simultaneously from the same controller.
Mega Man X is probably in my Top-10 all-time favourite games, and – compared to modern games – it is really unforgiving. I can’t even fathom being able to do something like this.
I’m in the market for a new computer1, but I have no idea what way to go. I’ve been making do with older kit for the last few years, but all of it is pretty much at the end of its usable life.
I recently set up a new “office” area in the house, and the way I did it allows me to swap between my work-supplied laptop, and a computer of my own, just by plugging into the right monitor input and swapping a USB cable. This setup also allows my son to make use of the desk if he needs to.
Until recently, the computer I used most around the house was a 9 year old Dell Latitude laptop which I had made usable by putting an SSD into it, and building a lightweight Arch Linux installation. This was primarily because a laptop was all I had space for. Actually, I tell a lie – the “computer” I use most is my iPhone, but for times the iPhone can’t cut it (for whatever reason) I used the Dell2. While this arrangement worked, it showed its age, and it was fiddly at times.
I’ve had a 6 year old Mac Mini lying around for a while, doing nothing. It’s only barely more powerful than the Dell3, and the one time I had it plugged into the living room TV, it was just plain awkward to use. With the new office I was able to plug it in to a proper monitor/keyboard/mouse arrangement which made it more viable. So this past weekend I took the SSD from the Dell, put it in the Mac, and made that my “home computer.” It’s just fast enough to not induce rage when trying to do anything more taxing than surf the web and other light duties.
Now I’ve got a “proper” desk and space, I’ve been thinking I should look at getting something which will last me another few years. The cheapest upgrade I could do is to spend ~£60 and double the RAM in the Mac Mini, going from 4GB to 8GB. I’m sure that will give a noticeable boost to OS X, but it doesn’t really change the fact the system is on borrowed time. It could buy me another 6-12 months, but at some point, likely soon, something is going to fail. The way I see it, my choices are:
- Buy a newer Mac, probably a laptop for flexibility (plus that’s where all their non-iOS/Watch innovation seems to be going).
- Buy a Windows laptop.
- Build a custom PC.
Of the choices, #3 is likely the most satisfying, and would have the most upgrade potential further down the line, though I would be constrained later by choices I made now. It also has the potential to get very expensive; I priced up a high-end Mini-ITX system for a bit of fun, and it came to roughly £1000 before choosing a suitable graphics card. I could definitely price something for less, and would probably have to, but it would have to be weighed against longevity of usable performance and upgradability. I am a little space constrained, so a massive tower is never going to be practical, but there are plenty of options between Mini-ITX and mATX nowadays.
A Windows laptop feels like it would be a cop-out, and there’s not much out there I feel inspired enough to part with my money for. There’s a couple of nice laptops I’ve seen4, but none I feel would last as long as I’d like them to.
Getting a new Mac has been the direction I’ve been leaning towards for a while, but I’ve always struggled to justify it vs. other spending priorities. Plus, when you factor in how fast Apple iterate their hardware and the lack of after-sale upgradability, you’re always hoping to “time it right”. That said, as an iPhone/iPad owner there’s a lot of upside to getting a Mac, for example: close integration through Handover/Continuity (granted, which I can’t currently use with the Mini), and iCloud Photo Library. I guess I could set up something more “cross-platform” for the photo library, using Dropbox, but I found Apple’s solution to be that little bit nicer to work with.
So the gist of this much-longer-than-I-planned stream of consciousness is that I need to start thinking about replacing the old and almost busted computer kit I have with something new. I don’t know what that will be yet, and I’d hoped getting my thoughts out would help me focus my mind on what I want to do.
No such luck though. Any ideas?
- Anyone who knows me probably knows I’ve actually been talking about it for ~4 years. ↩
- And what of my iPad? I mainly just use it for Hearthstone and Games Workshop rulebooks. Since iOS 8 (I think), my iPad has taken a huge hit in performance, and just isn’t as capable as it once felt. ↩
- On paper, at least. In practice it was severely hamstrung by the old-school HDD and running OS X. ↩
- My work laptop is quite nice; it’s a Dell Ultrabook, thin, light, and performant enough. But the consumer pricing is higher than I’d value it at. ↩
With all the cool new stuff constantly being released recently, it can be very easy to end up with a large hobby backlog. When this happens it’s possible to get overwhelmed by your “to do list,” and it starts to become a mental drag; when this kicks in, your hobby no longer feels fun and instead feels like working a job you hate. Sometimes it’s just best to declare something a lost cause and start over afresh.
I went through this very recently. My backlog had grown too big for me to see the end of it – especially with the glacial pace I paint at! When I took stock of what was in the queue I had 2 full armies: a jump-heavy Flesh Tearers list, and a mechanised Tempestus Scions list. Not counting fun stuff like vehicles and characters, I had well over 100 models to prepare, assemble and paint… and these are just the army projects! Throw in various starter boxes for other games, and other sundry small projects, and the list was nearer 400.
Too. Damn. Many.
What to do? My initial plan was to freeze buying anything new until I’d whittled the backlog down to a more manageable level. Such a sensible plan might work for many a struggling hobbyist, but unfortunately it was not the right plan for me. Despite several months of not buying any new figures1, I made zero impact on the pile of miniatures I had to work through. On top of that, I found myself losing all inspiration for certain projects. Some of that came down to gnawing insecurities about being able to achieve the vision I had in my head, others from indecision about what that vision even was any more. In the end there was just a pile of boxes and sprues causing me to feel terrible every time I thought about it. This was no longer a hobby; it was a chore. Something had to give, and it would be great if it wasn’t me.
In the tech world, there’s a popular approach to email management called Inbox Zero. The idea is to have your email inbox as empty as possible, so the amount of time your brain is occupied by email is as close to zero as possible. The intention is to reduce the distraction and stress caused by an overwhelmingly full inbox. Related to Inbox Zero, is Email Bankruptcy – the practice of deleting all email older than a certain date (often that day) due to being completely overwhelmed.
One day I realised I needed to declare something similar – Hobby Bankruptcy – or I was going to drive myself out of a hobby I’ve loved for over 20 years.
How was I going to do this? Throwing out hundreds2 of pounds of miniatures would be insane, especially if I changed my mind about something. Selling would take too long, and was subject to the fickleness of others. The simplest (non-destructive) solution won out: I took everything 40K/WHFB related, and stashed it in the loft. Out of sight; out of mind. Literally. The only survivors of the “purge” were source books and the limited edition 25th anniversary Crimson Fists miniature.
I can’t express just how much of a weight off doing this has been. I’m no longer under (self imposed) pressure to work through a massive backlog I no longer had the enthusiasm for, and yet, if I rediscover that enthusiasm, I can pull individual kits from the loft to work on as and when I want to.
In the meantime though, I am free to start work on new projects3…
And yes, I do know I’m crazy.