Politics

The government's approach to encryption could make us more vulnerable

It's impossible to create a backdoor that only the "good guys" can use

June 05, 2017
After WhatsApp was used by the Westminster attacker, Amber Rudd vowed to take on encryption. Photo: PA

From pig Latin to the complex mathematics of today's computer encryption, encoding communications is as old as humanity. Often, as with Alan Turing's work in World War II, cracking the enemy's codes has conferred crucial military advantage.

Because the internet was designed to share, rather than secure, information, encryption plays several important roles in today's digitised landscape. It ensures that sensitive data can't be read by unauthorised people: when a healthcare manager forgets the clinic's laptop in a taxi, a criminal steals a company's usernames and passwords, or a consumer sends credit card details to an online retailer, encryption protects the data against interlopers.
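
To make that first role concrete, the short sketch below encrypts a record with a secret key; anyone who steals the laptop or intercepts the traffic without the key sees only scrambled bytes. The choice of the Python "cryptography" library's Fernet recipe, and the sample data, are illustrative assumptions, not a description of any particular system mentioned above.

```python
# A minimal sketch of symmetric encryption using the Python "cryptography"
# library's Fernet recipe (an illustrative choice, not a specific product).
from cryptography.fernet import Fernet

# A randomly generated secret key, known only to authorised users.
key = Fernet.generate_key()
f = Fernet(key)

# The sensitive record is stored or transmitted only in encrypted form.
ciphertext = f.encrypt(b"card number 4111 1111 1111 1111, expiry 09/27")

# Without the key, the ciphertext is unreadable; with it, the data comes back intact.
assert f.decrypt(ciphertext) == b"card number 4111 1111 1111 1111, expiry 09/27"
```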

Encryption also provides a way to check that digital files—from the software programs that run your car's braking system to medical images and electronic payments—haven't been tampered with.
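
That tamper-checking role usually rests on cryptographic hashes and digital signatures. As an illustration only (the file name and expected value here are hypothetical), the snippet below computes a SHA-256 digest of a file; if even a single byte has been altered, the digest no longer matches the value published by the file's distributor.

```python
import hashlib

def sha256_of(path: str) -> str:
    """Return the SHA-256 digest of a file, read in chunks to handle large files."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Comparing against a known-good digest reveals tampering.
# The file name and expected value below are purely illustrative.
# expected = "ab3f...e91c"   # digest published alongside the genuine file
# if sha256_of("brake-controller-update.bin") != expected:
#     raise SystemExit("File has been modified; do not install.")
```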

Around 1990, three interrelated developments coalesced to disrupt the policies that govern encryption. The first was two decades of steadily growing adoption of computers and computer networks. Second, cryptographers began working outside the military, in academia and commercial companies. Third, computing plummeted in cost while escalating in power.

In 1991, a programmer named Phil Zimmermann wrote and released Pretty Good Privacy, a free program that for the first time offered individuals government-strength encryption for their email and stored computer data.

But by the mid-1990s, a steady stream of government spokespeople was arguing that the masses should not be allowed to use encryption without storing a copy of their decryption key for government access ("key escrow").

The threats posed by drug dealers, organised crime, paedophiles, and terrorists were invoked so often that these groups became known as "The Four Horsemen of the Infocalypse".

A steady stream of mathematicians and security experts countered that key escrow would create a tempting target for cyber criminals, while enabling covert surveillance of the general population. In Britain, these "First Crypto Wars" ended with the dropping of key escrow from the Regulation of Investigatory Powers Act (2000). Instead, the act gave the authorities the power to compel suspects to produce their encryption keys or face jail time. To key escrow's opponents, it appeared that common sense, and pragmatic aspirations for the digital economy, had prevailed.

Today's revived battle—the "Second Crypto Wars"—began soon after Edward Snowden's 2013 revelation that the US National Security Agency and GCHQ had deliberately weakened cryptographic software by infiltrating standards organisations and introducing hidden flaws. The broader technology community, now expanded to include large technology companies that didn't exist in the 1990s, responded with campaigns such as Encrypt All the Things and Let's Encrypt, and with secure messaging systems that even their owners cannot read (using "end-to-end encryption")—Facebook's WhatsApp being one example.
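
End-to-end encryption works because the two communicating devices agree a secret between themselves, so the service in the middle never holds a key it could hand over. The sketch below shows the basic idea with an X25519 key exchange from the Python "cryptography" library; real messaging protocols layer much more on top (key verification, ratcheting for forward secrecy), so treat this as an illustration of the principle rather than how WhatsApp itself works.

```python
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

# Each device generates its own private key and shares only the public half.
alice_private = X25519PrivateKey.generate()
bob_private = X25519PrivateKey.generate()

# The server relays the public keys but never sees the private ones.
alice_public = alice_private.public_key()
bob_public = bob_private.public_key()

# Each side combines its own private key with the other's public key.
alice_shared = alice_private.exchange(bob_public)
bob_shared = bob_private.exchange(alice_public)

# Both devices arrive at the same secret without it ever crossing the network,
# which is why the provider cannot decrypt the messages it carries.
assert alice_shared == bob_shared
```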

Less than two months ago, news that the Westminster attacker had sent WhatsApp messages three minutes before driving onto the bridge led Home Secretary Amber Rudd to say, "You can’t have a situation where warranted information is needed, perhaps to stop attacks like the one last week, and it can’t be accessed."

She was speaking with the benefit of hindsight. Just as with vans and mobile phones, for every terrorist who uses encryption there are millions of innocent people who need it for legitimate reasons.

You may feel comfortable giving the British police access to your WhatsApp messages. Would you feel as comfortable if government-unlockable encryption extended to your van, allowing it to be disabled because someone believed you might become an attacker?

In the 2015 paper "Keys under Doormats", 15 security experts, including Cambridge University's Ross Anderson, outlined the extensive damage such a policy would cause.

Current work to improve internet security would be reversed, leaving us even more vulnerable to the burgeoning levels of cyber crime. Systems would become more complex and therefore more prone to unpredictable failures.

After last week’s attack in Manchester, Rudd again appeared on the Andrew Marr Show, saying that she understood the importance of encryption, but that “we need to do better to stop terrorists using it.”

This poses further problems. As "Keys under Doormats" explains, backdoors themselves would be targets for criminals: there is no such thing as a hole that only "good guys" can use.

Rapid, automated access of the kind Rudd appears to be suggesting would remove the checks and balances that human rights law requires for government searches, affecting millions of innocent people.

Finally, government access raises huge jurisdictional questions. Which of the world's 220 countries should have access? Under what circumstances, and subject to whose controls? And how should those decisions be enforced?

Targeted surveillance of suspects is a necessary power to counter genuine threats. But damaging the entire internet infrastructure has massive social, technical, and economic costs. In an interconnected society, we are all only as safe and secure as the weakest node allows us to be.