Recently I was having a conversation about politics - specifically, about why, after listening to countless interviews and reading all of the manifestos, I will not be voting Conservative in this election. It turns out there are many reasons, but in this post I just want to talk about one.

One of the issues closest to my heart is covered in the section of the most recent Tory manifesto entitled ‘Prosperity and Security in a digital age’ - point 5 of the manifesto, with a whole eight pages devoted to it.

There are a lot of problems with the Conservative approach to the internet, but I’m going to focus on one: Theresa May wants to regulate the internet and undermine end-to-end encryption.

End-to-end encryption is the mechanism used across the internet to ensure that only the person you are communicating with can read your messages. The poster child the government are using for this is WhatsApp, but there are many more uses for end-to-end encryption than having private chats with your mates - for example, sharing your personal medical records with your doctor, sending your credit card details to an online shop in order to buy something, or using online banking. You know that little padlock you see in your browser’s address bar when you’re looking at Amazon? That means the communications between your computer and Amazon are encrypted so that only the two ends - you and Amazon - can read them.
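To make that concrete, here’s a minimal sketch of the idea in Python using the PyNaCl library (the library choice, the names and the message are purely illustrative): a message encrypted for Bob can only be read with Bob’s private key, which never leaves his device.

```python
from nacl.public import PrivateKey, Box

# Each person generates a keypair; the private half never leaves their device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts using her private key and Bob's public key.
sender_box = Box(alice_private, bob_private.public_key)
ciphertext = sender_box.encrypt(b"my sort code is 01-02-03")

# Bob decrypts using his private key and Alice's public key.
receiver_box = Box(bob_private, alice_private.public_key)
assert receiver_box.decrypt(ciphertext) == b"my sort code is 01-02-03"

# Anyone in between - the network, the app's servers, a government -
# holds neither private key and sees only ciphertext.
```

In a real service like WhatsApp the key exchange is far more elaborate, but the principle is the same: the only keys that can unlock the conversation live on the devices at either end.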

Have you ever had your credit card details stolen and used online? Or were you affected by the PlayStation Network outage in 2011, when a data breach at Sony compromised the personal details of around 77 million accounts? It’s a scary, dehumanising experience to realise that you’ve been robbed or had your identity stolen. End-to-end encryption, when implemented correctly, is the core technology that helps prevent things like this from happening.

Theresa May naively thinks that she can introduce legislation requiring tech companies to write their software in such a way that certain people can ‘unlock’ all these encrypted messages. She wants to do this to prevent ‘safe spaces’ existing where terrorists can communicate. That is a laudable goal - but unfortunately this implementation just cannot work.

Imagine your digital life as your house. Everything about you is in it: all your money and your bank details, your holiday photos from Facebook, your tax returns, that awkward conversation you had with your ex on WhatsApp, a record of every single website you looked at and when, the location data of everywhere you’ve been while carrying your phone. All of this and more are your possessions.

Your house is secured by a strong front door with a good lock, and you have the only key. This is a good place to be. Sure, determined criminals could break the door down, or climb in through a window and steal everything, but that’s an exceptional circumstance; as a general rule, you’re safe.

Now imagine that the government had a master key, a key that would unlock everyone’s house in the country. They’ve assured you that only they have access to this key; they’ve assured you that they’re only going to use it in the interests of keeping you safe. How do you feel about this? Does it make you feel safer? How do you know you can trust them? What happens if they lose the key? What if someone steals it?

Obviously, blind trust in anyone is probably a bad idea. There have been myriad occasions where unscrupulous individuals have abused the trust given to them for their own personal gain - fiddling expenses and taking bribes to influence policy are high-profile examples straight off the top of my head. So you probably shouldn’t assume that the people in charge of this master key are honest.

But let’s assume for a minute that they are - that I trust our government, that they’re going to be true to their word and not use this key for nefarious purposes, and that they’re going to look after it carefully and never let it fall into the hands of criminals or people who wish me harm. You still can’t prevent an enterprising locksmith from buying their own lock, working out from it what the master key must be, and making their own copy.

This could happen now, you might say: a clever locksmith could pick my lock, or work out how to get a key cut, and then they’d have access to my stuff. This is true, but in the new world they could do this and have access to more than just your stuff - they’d have access to everyone’s stuff.

If your lock gets broken, that’s inconvenient and frustrating, and it may even cause you harm, but you can change your lock and the problem is mostly solved. If the master key is copied, all locks are compromised, and you can’t change every lock in the whole country - that’s unmanageable and expensive. So in this new world of master keys, it only takes one breach to break the security of everyone in the country, forever.

I’m not worried, you might say, I have nothing to hide. (You might think that this is true - realistically it’s a much bigger conversation - but even assuming it is, you still have a fundamental right to privacy: having a shit is a perfectly normal bodily function but I imagine you still close the toilet door, and you probably don’t have sex with the curtains open.) I don’t have any illegal possessions in my house but I still wouldn’t go out and leave the door open - I don’t want someone breaking in and selling my stereo to the highest bidder!

All of this risk applies to end-to-end encryption. It only works as long as the keys are unique, kept safe, and as close to unbreakable as feasibly possible.
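In encryption terms, the government’s ‘master key’ amounts to key escrow: every message also gets encrypted to a key that the authorities hold. Here is a rough sketch of what that would look like, continuing the hypothetical PyNaCl example above (the escrow scheme itself is my illustration, not a description of any actual proposal):

```python
from nacl.public import PrivateKey, SealedBox

# A single escrow keypair, held - we are promised - only by the government.
escrow_private = PrivateKey.generate()
escrow_public = escrow_private.public_key

def send_with_escrow(message: bytes, recipient_public):
    """Encrypt one copy for the recipient and a second copy the master key can open."""
    for_recipient = SealedBox(recipient_public).encrypt(message)
    for_escrow = SealedBox(escrow_public).encrypt(message)
    return for_recipient, for_escrow

# If escrow_private ever leaks - one theft, one insider, one backup gone astray -
# every escrowed copy ever stored becomes readable, for every user at once,
# and there is no way to 'change the locks' on messages already sent.
```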

So Theresa May’s plans are a terrible idea - a dangerous, ineffectual idea for everyone involved. But maybe that’s an acceptable risk if it allows us to prevent atrocities like those that happened recently in Manchester or London?

Unfortunately this is where the plan really falls flat on its face. It makes a few glaring assumptions:

  • Terrorists will all buy phones and software from companies in the UK that all follow the regulations and implement the master key.

Facebook, WhatsApp, Apple, Google and the rest aren’t going to deliberately cripple their own encryption, and the safety of their users worldwide, because of a draconian, unworkable policy from the UK government.

There are a couple of things they might do if this regulation makes it into law: either stop selling or distributing their products in the UK (can you imagine the uproar if you couldn’t legally buy an iPhone or iPad in the UK?!), or build specific versions of their products just for the UK market.

What happens if a suspected terrorist buys a phone from overseas on the black market, or uses a US Apple account to download WhatsApp from the US App Store? Well, now they have proper encryption again, which renders the whole thing pointless. You’ve now got a system where you can monitor the online activity of everyone except the people you intended to monitor!

The government haven’t been able to stamp out the importing of tobacco, drugs, alcohol, guns or counterfeit goods at any point in history - so it’s bloody unlikely they’re going to be able to stop you getting your hands on a phone from overseas.

  • All software is written by companies who will follow the regulations

This is really two points. Firstly, not all companies will be coerced into following the regulations (see my point above); some companies for whom the UK is a small market may well just not bother, and stop distributing here. Secondly, it ignores the massive amount of free and open source software in the world. Even assuming you could force companies to comply with this legislation, you can’t force it on software that is distributed in its readable source-code form and is contributed to and owned by millions of people across the world.

  • People are incapable of writing software

Even if you manage to implement the legislation successfully, and all of your suspects have devices that can be read by the government whenever they wish, you can’t prevent people from writing their own encryption software - that’s the beauty of a general-purpose computer. And if people can write their own encryption software, then you’re back to square one. This is the monumental arrogance of assuming that you are smarter than your enemies.
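To illustrate quite how low that bar is, here is a hypothetical sketch (again using PyNaCl; the key handling is deliberately simplified): strong encryption that no UK-specific backdoor ever touches is a handful of lines with a freely available library, and a determined adversary could go further and implement the primitives themselves.

```python
import nacl.secret
import nacl.utils

# Generate a random secret key and encrypt a message with it.
key = nacl.utils.random(nacl.secret.SecretBox.KEY_SIZE)
ciphertext = nacl.secret.SecretBox(key).encrypt(b"meet at the usual place")

# Share the key over some other channel, and the recipient decrypts just as easily.
assert nacl.secret.SecretBox(key).decrypt(ciphertext) == b"meet at the usual place"
```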

So, hopefully, if you got this far I have convinced you that Theresa May’s approach to digital security is at best a draconian, unworkable, naive mess, and at worst a deliberate and dangerous assault on the privacy and rights of every citizen in the UK. We will be another name on the infamous list of countries attempting to ban or undermine encryption - joining Russia, Syria, Iran and North Korea.

Mostly I’m writing this because I believe strongly in the ideals of an open internet and the right to privacy online, and partly I’m writing it because I have a dog in this fight. I work in the tech industry, and I love my job. This country has an excellent and growing tech industry with fantastic potential - I’ve personally worked in London, Manchester and Canterbury, with many companies doing fantastic things. The current Conservative policies on Brexit and the free movement of people, and their dogged determination to undermine online security and privacy, are without doubt a direct attack on an entire industry - an industry I love working in - and a clear danger to my personal job security and my family’s ability to earn a living, as well as that of many other talented workers in this country.

Please help today to kick them out.