As part of the Global Conference on Cyberspace, the Netherlands will facilitate an expert panel on the right to privacy, one of the central themes of the conference. During this panel, highly esteemed privacy and security experts such as Bruce Schneier and Sir David Omand will discuss two questions: what will privacy look like in five years, and how do we balance privacy and security? To inspire this debate, the Ministry of Foreign Affairs of the Netherlands commissioned two thought-provoking pieces, one on each of these questions, from the award-winning Dutch data journalists Maurits Martijn and Dimitri Tokmetzis. As such, these pieces do not reflect the position of the Dutch government. The second of these papers, "How to protect privacy and security in the Crypto Wars" by Dimitri Tokmetzis, can be found below. For further discussion on this topic, follow the debate on Friday 17 April 2015 at 11:15 CET (livestream available).

* * *

How to protect privacy and security in the Crypto Wars

We thought that the Crypto Wars of the nineties were over, but renewed fighting has erupted since the Snowden revelations. On one side, law enforcement and intelligence agencies fear that broader use of encryption on the Internet will make their work harder or even impossible. On the other, security experts and activists argue that installing backdoors will make everyone unsafe. Is it possible to find some middle ground between these two positions?

"This is the story of how a handful of cryptographers hacked the NSA. It's also a story of encryption backdoors, and why they never quite work out the way you want them to."

So began the blog post on the FREAK attack, one of the most ironic hacks of recent years. Matthew Green, assistant professor at Johns Hopkins University, and a number of international colleagues exploited a nasty bug on the servers that host the NSA website.
By forcing the servers to fall back to an old, almost forgotten and weak type of encryption, which they were able to crack within a few hours, they gained access to the backend of the NSA website, making it possible for them to alter its content. Worse still, the cryptographers found that the same weak encryption was accepted by a third of the 14 million other websites they scanned. If they had wanted to, they could have gained access to whitehouse.gov or tips.fbi.gov, for instance. Many smartphone apps turned out to be vulnerable as well.

The irony is this: that weak encryption was deliberately designed into software products exported from the US in the nineties. The NSA wanted to be able to snoop on foreign governments and companies, and so pushed for a weakening of exported encryption. This weakened encryption somehow found its way back onto the servers of US companies and government agencies. "Since the NSA was the organization that demanded export-grade crypto, it's only fitting that they should be the first site affected by this vulnerability," Green gleefully wrote.

The FREAK attack wasn't only a show of technological prowess, but also a political statement. Ever since Edward Snowden released the NSA files in June 2013, a new battle has been raging between computer security experts and civil liberties activists on one side and law enforcement and intelligence agencies on the other.
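The export-grade weakness that FREAK exploited comes down to key size: "export" RSA keys were capped at 512 bits, which is small enough to factor with modern hardware (the researchers reportedly factored one such key in a matter of hours using rented cloud computing). The sketch below, a minimal toy with a deliberately tiny modulus and made-up demo values, shows why a factored modulus means a fully recovered private key:

```python
# Toy illustration of why short RSA keys fail: factor the modulus
# and the private key follows immediately. The primes and message
# here are tiny demo values, far smaller even than export-grade.

def factor(n):
    """Recover the smallest prime factor of n by trial division."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f
        f += 1
    raise ValueError("n is prime")

# A "server's" RSA key: modulus n = p * q, public exponent e.
p, q = 2003, 3119            # the server's secret primes
n, e = p * q, 65537          # only (n, e) are public

ciphertext = pow(42, e, n)   # someone encrypts the message 42

# The attacker sees only (n, e, ciphertext) and factors n.
p_found = factor(n)
q_found = n // p_found
phi = (p_found - 1) * (q_found - 1)
d = pow(e, -1, phi)          # private exponent, rebuilt from the factors

print(pow(ciphertext, d, n))  # prints 42: the message is recovered
```

A real 512-bit modulus requires specialised algorithms such as the number field sieve rather than trial division, but the endgame is identical: factor n, derive d, and decrypt everything the server thought was protected.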
There was one set of revelations that particularly enraged the security community. In September 2013 the New York Times, ProPublica and the Guardian published a story on the thorough and persistent efforts of the NSA and its British counterpart GCHQ to decrypt Internet traffic and databases. In a prolonged, multi-billion-dollar operation dubbed BULLRUN, the intelligence agencies used supercomputers to crack encryption; asked, persuaded or cajoled telecom and web companies into building backdoors into their equipment and software; used their influence to plant weaknesses in cryptographic standards; and simply stole encryption keys from individuals and companies.

A war is looming

Security specialists argue that by attacking the encryption infrastructure of the Internet, the intelligence agencies have made us all less safe. Terrorists and paedophiles may use encryption to protect themselves when planning and committing terrible crimes, but the Internet as a whole cannot function without proper encryption. Governments cannot provide digital services to their citizens if they cannot use safe networks. Banks and financial institutions must be able to communicate data over secure channels. Online shops need to be able to process payments safely. And all companies and institutions have to keep criminals and hackers out of their systems. Without strong encryption, trust cannot exist online.

Cryptographers have vowed to fight back. Major web companies like Google and Yahoo! promised their clients strong end-to-end encryption for email and vowed to improve the security of their networks and databases. Apple developed a new operating system that encrypts all content on the new iPhone by default. And hackers started developing web applications and hardware with strong, more user-friendly encryption.
In the past few years we have seen the launch of encrypted social media (Twister), smartphones (Blackphone), chat software (Cryptocat), cloud storage (Boxcryptor), file-sharing tools (Peerio) and secure phone and SMS apps (TextSecure and Signal).

This worries governments. In the wake of the attack on Charlie Hebdo in Paris, UK Prime Minister David Cameron implied that encryption on certain types of communication services should be banned. In the US, FBI Director James Comey recently warned that the intelligence agencies are "going dark" because of the emergence of default encryption settings on devices and in web applications. In Europe, the US and elsewhere, politicians are proposing that mandatory backdoors be incorporated in hardware and software. Some even want governments to hold "golden keys" that can decrypt all Internet traffic.

The obvious question is how we can meet the needs of all concerned. On the one hand, how can we ensure that intelligence and law enforcement agencies have access to communications and data when they have a legal mandate to do so? Their needs are often legitimate. On the other, how can we ensure strong data protection for all, not only a tech-savvy few? As we shall see, this crypto conflict isn't new, nor is the obvious question the right question to ask at this moment.

Crypto to the people

Up until the seventies, the use of cryptography was limited to governments, big corporations and some maths enthusiasts. With the rise of electronic networks like the Internet, the demand for encryption grew. Academics started to develop new cryptographic methods, but were warned by intelligence agencies to refrain from publishing them, according to Bart Preneel, a long-time professor of cryptography at the University of Leuven in Belgium. The first encryption products were built into hardware, and exporting them was prohibited by most countries. "These export controls were outdated the moment encryption became available in software products in the late eighties," Preneel says. Phil Zimmermann developed his encryption product Pretty Good Privacy (PGP), which made it fairly simple to encrypt email traffic. Once uploaded onto the Internet, there was no stopping it, according to Preneel. The US authorities tried to stop Zimmermann from exporting his code, but PGP had already found its way onto the nascent network. Zimmermann also published the raw code in a book, making the export of his work a free speech issue. The Clinton administration still tried to force a backdoor to be incorporated in US-manufactured hardware, but this Clipper Chip proved to be unsafe and too contentious. Export controls were subsequently relaxed.

The same happened on the other side of the Atlantic. In 1995 the Wassenaar Arrangement was signed, restricting the export of cryptography along with many other products. In 2000 these restrictions were lifted. Strong, democratised encryption seemed unstoppable. Preneel said: "We thought we had won the war. We turned out to be wrong."

The Crypto War was lost

If anything, the proponents of strong encryption had probably lost the war. However, the war is certainly not over as far as FBI Director James Comey is concerned. In a speech at the Brookings Institution in October 2014 he told the audience that "perhaps it's time to suggest that the post-Snowden pendulum has swung too far in one direction, in a direction of fear and mistrust". Comey thinks that tech companies overreacted to the Snowden revelations: "Encryption isn't just a technical feature; it's a marketing pitch." He also objected to the term backdoor: "We want to use the front door, with clarity and transparency, and with clear guidance provided by law."
"We are completely comfortable with court orders and legal process: front doors that provide the evidence and information we need to investigate crimes and prevent terrorist attacks."

These comments by the FBI Director sound legitimate and certainly seem reasonable. But there are at least three objections to installing decryption capabilities in infrastructure and software.

Making everyone less secure

The first objection is: who gets to decide who uses a backdoor? The famous cryptographer Bruce Schneier has often warned that modern computer technology is fundamentally democratising: today's NSA secret techniques are tomorrow's PhD theses and the following day's cybercrime attack tools. In other words, if you install a backdoor, you can never be sure that someone else will not find it and use it for nefarious purposes.

A strong case in point is the so-called Vodafone hack discovered in Athens, Greece, in 2005. A lawful wiretapping facility, intended for use by the country's law enforcement agencies, was compromised, and more than a hundred people were spied on, possibly for two years prior to the discovery. The culprits remain unknown to this day. The targets included journalists, Arab individuals, senior government and secret service officials and an American embassy worker. Similar major security breaches have been discovered in other countries too.

Theoretically it might be feasible, as current NSA Director Michael Rogers argues, to build a backdoor that only his agency can use. The NSA actually came close to building such a backdoor with Dual_EC_DRBG, the Dual Elliptic Curve Deterministic Random Bit Generator, a piece of software that became one of the few international standards used to generate encryption keys. The Snowden files showed that in the early 2000s the NSA had engineered a weakness into the design through which only they could predict the output of the generator, and with that knowledge they were able to break widely used encryption keys. The only problem is that, even years before Snowden blew the whistle, cryptographers already suspected that something was wrong with the design, although they couldn't find definitive proof. And the leak itself shows that even the single most advanced intelligence agency cannot keep its secrets. The real world keeps disproving the theory.

Unsound economics

The second argument is one of economics: backdoors can stifle innovation. Until very recently, communications were a matter for a few big companies, often state-owned. The architecture of their systems changed slowly, so it was relatively cheap and easy to build a wiretapping facility into them. Today thousands of start-ups handle communications in one form or another, and with each new feature these companies provide, the architecture of their systems changes. It would be a big burden for these companies if they had to ensure that governments can always intercept and decrypt their traffic.

Backdoors require centralised information flows, but the most exciting innovations are moving in the opposite direction, towards decentralised services. More and more web services are using peer-to-peer technology, through which computers talk directly to one another without a central point of control. File storage, payment processing and communications services are now being built in this decentralised fashion. It is extremely difficult to wiretap these services, and if you were to force companies to make such wiretapping possible, it would become impossible for these services to continue to exist.

A government that imposes backdoors on its tech companies also risks harming their export opportunities.
For instance, Huawei, the Chinese manufacturer of phones, routers and other network equipment, is unable to gain market access in the US because of fears of Chinese backdoors built into its hardware. US companies, especially cloud storage providers, have lost overseas customers due to fears that the NSA or other agencies could access client data. Unilateral demands for backdoors could put companies in a tight spot. Or, as researcher Julian Sanchez of the libertarian Cato Institute puts it: "An iPhone that Apple can't unlock when American cops come knocking for good reasons is also an iPhone they can't unlock when the Chinese government comes knocking for bad ones."

There is no going dark

Third and finally, there is no going dark. Law enforcement and intelligence agencies can still intercept and read a great deal of data. In fact, we are living in what many security commentators call the golden age of surveillance. As more and more of our activities are mediated by technology, we leave a growing digital trail that reveals a great deal about ourselves. Encryption lets you hide the content of messages, but not their context, the so-called metadata, which reveal for instance what you read, who you talk to and where you are. This kind of data reveals so much that former NSA Director Michael Hayden once boasted: "We kill people based on metadata."

Furthermore, many governments and private businesses have developed formidable offensive capabilities. Many intelligence and law enforcement agencies are able to hack the end points of systems, i.e. the devices we use. There is a small but growing industry in finding and selling details of software vulnerabilities so that they can be exploited. Many governments use the services of companies like Gamma International (which offers the FinFisher hacking suite), Hacking Team (Remote Control System) and VUPEN (which sells so-called zero-days, software vulnerabilities not yet known to the vendor or the public). With the rise of ubiquitous computing and the Internet of Things, the volume of revealing data streams will only increase, leaving no shortage of data to intercept or devices to hack.

The real trade-off?

When the balance between security and privacy is viewed from an economic perspective, there seems to be a trade-off. Yes, backdoors are possible, but the price could be higher than we bargained for. Technically it might be feasible to install backdoors that only law enforcement and intelligence agencies can exploit, but the real world of organisations and software implementation might mess up this carefully scripted scenario. Such a scenario would also only play out if governments were to force all communications providers to give them centralised access, something that could severely stifle innovation. And it may transpire that building backdoors requires a crackdown on academic research: you don't want your backdoor to be exposed by nosy professors.

More importantly, the choice between privacy and security is a false one. The real choice is between mass surveillance and targeted surveillance. It is true that encrypting a great deal of data by default would make legitimate efforts to intercept and decrypt communications a lot harder. But this would mostly affect surveillance on a large scale; targeted surveillance would remain possible. And when it comes to targeted surveillance, there are a number of viable policy options.

The first is the decryption order. With a court order, suspects can be forced to decrypt their information; if they refuse to comply, they can be sent to jail. Several countries, like France, the UK and the US, are already using this mechanism.
Others, like the Netherlands, are considering it.

Another option is remotely hacking devices. The end points of the Internet, our devices, are its weakest links. Recently The Intercept revealed that the NSA stole the encryption keys of Gemalto, a multinational company that produces SIM cards for many phone operators. In the same vein, law enforcement agencies could use malware to gain access to devices and intercept data before they are encrypted. With robust legal safeguards in place, government hacking could be a viable option, although at this stage no one knows how to prevent abuse of malware: once an agency has malware, it is very easy to produce different versions of it and very difficult to control its spread.

The final option is increased data retention for all companies and institutions, so that agencies can access historical metadata. But as last year's annulment of the Data Retention Directive by the European Court of Justice showed, mandated storage of data has to be embedded in a very robust data protection framework.

Whatever the outcome of this new Crypto War may be, it is clear that a ceasefire needs to involve all parties and address all legitimate interests. In the end the overriding interest is the same for everybody: a secure and robust Internet. And we need secure and robust cryptography to get there.