The Crypto Wars Are Over

The encryption debate, now in its third decade, still revolves around what kinds of encryption citizens can use and under what conditions law enforcement agencies can have access to encrypted messages. In the last year, the terms of debate have shifted significantly as an increasing number of individuals have downloaded messaging services like Signal or Telegram that provide end-to-end encryption. It is too late to roll this back, absent draconian measures that are unlikely to win political support (barring some horrific event linked to encryption use).

In thinking about this change, we should recognize that the discussion of encryption has achieved a mythological status, in the sense that much of it revolves around myth rather than fact. The myths include assertions that encryption protects privacy and civil liberties, or ensures cybersecurity, or that the government wants back doors. There are varying degrees of truth to these assertions, but none is truly accurate and none is a good guide to policy.

Everyone in government favors the use of encryption, given its contributions to data protection. The issue has been what kind of encryption should be allowed and under what conditions government agencies can access encrypted data. The choice is between “end-to-end” and “recoverable” encryption. With recoverable encryption, a third party can provide access to unencrypted data (plaintext) without the user’s involvement or even knowledge. With end-to-end encryption, only the sender and the recipient can easily access the data being transmitted. The attitudes of leading intelligence agencies toward end-to-end encryption suggest that while it is more difficult and expensive to obtain access, it can be done. That attitude is not shared by law enforcement agencies.
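The distinction can be made concrete with a deliberately simplified sketch. The snippet below uses a toy XOR cipher (not real cryptography, chosen only so the example is self-contained) to show the structural difference: under end-to-end encryption only the sender and recipient hold the key, while under a recoverable scheme a third party, such as the provider, retains a copy and can produce plaintext without either party's involvement. All names here are illustrative, not drawn from any real system.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy XOR "encryption" for illustration only -- not secure.
    return bytes(b ^ k for b, k in zip(data, key))

message = b"meet at noon"

# End-to-end: the key exists only on the sender's and recipient's devices.
session_key = secrets.token_bytes(len(message))
ciphertext = xor_cipher(message, session_key)
decrypted_by_recipient = xor_cipher(ciphertext, session_key)

# Recoverable: the provider also retains a copy of the key, so a warrant
# served on the provider can yield plaintext without the user's knowledge.
escrow_copy = session_key
recovered_by_provider = xor_cipher(ciphertext, escrow_copy)

assert decrypted_by_recipient == message
assert recovered_by_provider == message
```

The policy debate is, in effect, about that one line: whether anyone other than the endpoints is permitted, or required, to hold `escrow_copy`.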

Law enforcement agencies oppose the use of end-to-end encryption because it undercuts their ability to conduct electronic surveillance and because decrypted transcripts are a powerful tool of the prosecution. They prefer recoverable encryption, where if a court agrees, they can serve a warrant requiring the network operator, service, or manufacturer to provide plaintext. Many large companies already use recoverable encryption so they can see what their employees are doing and ensure compliance with regulations, but an increasing number of individuals use messaging services like Signal or Telegram that provide end-to-end encryption services.  

A second issue, given that most commercial encryption products and services can be defeated with enough money and time, is whether to allow law enforcement agencies, acting in a manner consistent with established legal protections, to use any means to gain access to plaintext under warrant. Some privacy advocates want no limits on encryption but many limits on law enforcement’s ability to gain access. This unbalanced approach would not serve the public interest. The best policy would allow everyone to use whatever encryption they like while also permitting law enforcement agencies, when authorized by a court, to use any tool or service to get access to data.

Back Doors

The back door myth dates back to the internet's commercialization in the 1990s. To get the fledgling technology off the ground, the Clinton administration decided it should be unregulated—no taxes, no product or provider liability (hence the debate over Section 230), no privacy protection, and no delays to address security. At the same time, the administration relaxed U.S. restrictions on encryption, hoping that this would encourage its widespread use to protect internet communications and data. Encryption was hard to install at that time, but this changed with easily downloadable apps, which gave everyone access to encryption. Encryption wasn't a problem for law enforcement agencies early on because most people didn’t use it, and societies underestimated online risk. We've since learned that despite its many benefits, the internet too often brings out the worst in human behavior.  

What the FBI wanted was something like the Communications Assistance for Law Enforcement Act (a law requiring telecom providers, as they moved to digital technologies, to ensure that it would still be possible to wiretap phone calls) for the internet—a front door, not a back door. In the 1990s, true back doors were unattractive to agencies because of the concern that opponents like Russia would discover them and circumvent the protection that encryption was supposed to provide. In the early 1990s, the NSA and the FBI proposed an alternative approach that would add a special NSA-designed chip to internet-connected devices. It would encrypt data but also permit "lawful access" (and triple the price of most devices). The Clipper Chip was a bad idea, and it set the precedent that any government proposal on encryption engenders protests from privacy advocates.

As encryption has become easier to acquire and install, and as people have moved to greater use of smart phones and texting, many have downloaded end-to-end encrypted messaging services. It is unlikely that any effort to restrict access would work now. Some governments can order service providers to block the use of end-to-end messaging services, but the United States (unlike China, Russia or Iran) lacks the authority to do this, and Congress is unlikely to provide such authority. These services make it hard for conventional communications surveillance measures to work, but there are alternative ways to get access to plaintext, albeit more difficult and more expensive. Back doors are often unnecessary, given the many flaws in commercial software and the ability to use alternative techniques to gain access. The recent SolarWinds exploit revealed one long-standing practice of advanced intelligence agencies—capture the updater—and there are other techniques that can be used to compromise a device and provide a workaround to end-to-end encryption.

Privacy and Human Rights

The debate over encryption is often framed as protecting privacy, even though the users of end-to-end encryption include criminals, terrorists, and child pornographers. This is the crux of the law enforcement argument: that we are inadvertently protecting the worst among us. Irrespective of that, encryption does not protect privacy. The chief exhibit in making this case is that tech companies shred privacy no matter what kind of encryption you use. Your smart phone is a mobile tracking and surveillance device whether you use end-to-end encryption or not. Some Android phones, for example, track your location, speed and altitude, contacts, calls, and emails or messages sent from the device, along with other data, and share this with private entities. It’s in the Terms of Service you didn’t read. This is built in and cannot be turned off. Many social media websites also collect personal data and have few restrictions on how they can use it. Advocating untrammeled use of encryption is not really intended to protect privacy, but to defeat government surveillance. These objections grew out of the anti-war and civil rights movements of the 1960s and 1970s. But it is no longer the 1970s. Strong oversight and constraints on government surveillance have been enacted since then. These rules, safeguards, and transparency mechanisms around surveillance protect the public, and it is in the public interest to permit lawful surveillance, but those who distrust government are strongly opposed to it.

In the United States, the term privacy does not appear in the Constitution. What does appear is a protection against unreasonable search. A reasonable search is one undertaken in accordance with laws passed by Congress and approved by the courts. These laws exist, and surveillance conducted in accordance with them is reasonable. Government agencies cannot have access except when a court issues a warrant. One can of course build products to avoid complying with requests for a reasonable search, and if there is distrust of the government or concern over the commercial consequences in foreign markets of being seen as cooperating with the federal government, this designing out of access can be attractive. 

However, experience shows that encryption does not protect human rights. Using encryption in a police state just ensures special attention from the security services. While they may not be able to read messages, they can tell someone is using encryption and take steps in response. Encryption by itself offers no protection in countries that do not observe the rule of law, since in these countries, devices can be seized or compromised, cameras and bugs can be planted in homes and cars, and encryption users can be violently coerced.     

Rights are social constructs that depend on institutions and rules and the ability to enforce those rules. End-to-end encryption chips away at that ability to enforce rules and protect rights. This is contrary to the libertarian fantasies that often appear in debates over encryption, which argue that individuals can best protect themselves. A determined state with no respect for the rule of law, using untrammeled surveillance, has no trouble quashing individual rights whether or not encryption is being used. Our objective should be to protect a democratic state where the rule of law protects fundamental rights. Libertarian approaches toward technology offer no defense against the erosion of personal liberty in places where the rule of law has been degraded.

The heart of the problem is trust in government. There is an understandable decline in respect for political leaders and a corresponding growth in distrust and cynicism. But safety in a democracy comes from public institutions with the ability to enforce laws and the public’s ability to oversee and control these institutions. Disagreement over the legitimacy of government is at the core of the encryption debate. But the alternatives to the rule of law are either a dictatorial strongman or an enfeebled and feckless democracy. Neither is attractive, and both put civil rights at risk.

Nobody Gets Everything They Want

The availability of alternative techniques to circumvent end-to-end encryption offers a potential solution to the problem, albeit an imperfect one. An example of the challenges associated with circumvention comes from the San Bernardino terrorism case, where law enforcement recovered a smart phone that used encryption, but the company that developed it was unwilling to help provide access. Tech companies often worry that appearing to cooperate with U.S. law enforcement may raise concerns among foreign customers, and some are reluctant to take on the legal costs associated with providing assistance.

The counterargument is that just as unwilling car companies were forced by law to install seatbelts to protect the public, tech companies or internet service providers should be mandated by law to maintain an ability to provide access to protect the public. Australia and some EU member states have explored legislation to do this, but it seems unlikely that the United States will match them. An international agreement to constrain encryption might change this (the United States approached allies about such an agreement 20 years ago but did not pursue it), but even with international consensus there would be a difficult political battle in the United States.       

In the San Bernardino case, the FBI ultimately turned to a tech company that specializes in defeating encryption and was able to access the encrypted data without the manufacturer’s assistance. This led, perversely, to calls that law enforcement agencies be forbidden from using such decryption services. These techniques are more expensive than conventional interceptions, meaning they will not be used at scale, and the risk of allowing their use under court approval is negligible. While there are legitimate concerns about the use of third-party surveillance services by private individuals or non-democratic states, in places like the United States where court approval is required for access, forbidding the use of these services makes no sense.

A pragmatic compromise on encryption would balance safety mandates for encryption providers with new safeguards and oversight for law enforcement. While the likelihood of Congress agreeing to this compromise is low, inaction preserves both public access to end-to-end encryption and law enforcement agencies’ access to the tools that might defeat it. A vocal minority opposes any compromise. Many remain unconvinced of the risks of end-to-end encryption. However, in a worst-case scenario, we might quickly discard our scruples about banning end-to-end encryption, or at least about the tools that defeat it. U.S. citizens have no real substitute for government to protect them. If you have lived in a country where the rule of law is a sham or non-existent, you understand why this is so.

Let's agree that everyone should use encryption, but in ways that truly protect civil liberties by maintaining an ability to enforce the law against crime, terrorism, or sedition. The alternative to a reasonable compromise on encryption is living in a digital state of nature. Arguing for a sacrosanct right to use encryption without allowing for law enforcement recourse suggests that individual liberties should take precedence over the larger concerns of the community. This is not what the framers of the Constitution intended. They intended to create a reasonable balance between rights and responsibilities. The events of January 6 make this balanced approach in encryption and in other areas even more essential.    

James Andrew Lewis is a senior vice president and director of the Strategic Technologies Program at the Center for Strategic and International Studies (CSIS) in Washington, D.C.

Commentary is produced by the Center for Strategic and International Studies (CSIS), a private, tax-exempt institution focusing on international public policy issues. Its research is nonpartisan and nonproprietary. CSIS does not take specific policy positions. Accordingly, all views, positions, and conclusions expressed in this publication should be understood to be solely those of the author(s).

© 2021 by the Center for Strategic and International Studies. All rights reserved.
