The More Things Change, the More They Stay the Same

One of the first issues I had to deal with when I joined the Commerce Department in 1993 involved surveillance and encryption. U.S. intelligence and law enforcement authorities wanted access to telephone conversations of selected individuals—foreign for the intelligence folks and domestic for the law enforcement folks—and they proposed that companies making phones equipped with encryption capabilities be required to build in a “back door” that would allow the government access.

There was widespread opposition to that proposal, both inside the government and from technology and telecommunications companies and privacy organizations, and it did not happen. The idea did not, however, go away, and there have been rumors from time to time that its proponents would like to try again. That is no more likely to happen now than it was then, and if that were all this column was about, it would be a short one.

Instead, it is an opportunity to reflect briefly on two of the fundamental issues raised at the intersection of privacy, security, and technology. These issues were fought out in the late 1990s, in what was known as encryption war 1.0, over how to apply export controls to encryption technology, and they are still being fought today in the 2.0 version of the war. The debate has also resurfaced as companies rolled out end-to-end encryption, which allows people to send messages that cannot be opened by the service provider, allegedly making it impossible for law enforcement to force providers to produce unencrypted versions of their users' messages.

The same issue is being debated right now in the United Kingdom, as Parliament considers legislation that companies fear would give the government the authority to require messaging platforms to be capable of scanning users' messages for illegal content, usually identified as relating to child sexual abuse material (CSAM) or terrorism. Imposing that requirement would effectively prohibit end-to-end encryption.

There is some disagreement over how far the bill—which has not finished making its way through the legislative process—would actually go, but the fundamental debate is over security versus privacy. The government argues that citizens need to be protected from certain content and the state needs to be able to identify activities that threaten it, while privacy proponents and free speech activists point out that blocking end-to-end encryption potentially eliminates privacy for everyone, and that objectionable content is in the eye of the beholder. While there may be a consensus on preventing the dissemination of CSAM, the government could easily move to expand its reach to subjects where there is no consensus, with the result being a censorship regime that is the first step towards authoritarianism.

Here in the United States, we are wrestling with the same issues: surveillance authority for law enforcement and intelligence agencies, and content moderation policies for material deemed objectionable. While the United Kingdom has conflated the two in its Online Safety Bill, they are proceeding separately here.

The content moderation issue has taken a different turn here, with debate focusing on Section 230 of the Communications Decency Act of 1996, which protects platforms from liability for the content posted on them by other people. The EARN IT Act, which did not make it across the finish line last year but is being set up for another try, would withdraw Section 230 protection from platforms that do not properly police CSAM content that appears on their websites. Other bills take a similar approach. The opposition, as in the past, comes from tech companies, privacy groups, and LGBTQ+ organizations that are concerned the bill would prevent apps offering end-to-end encryption from using that as a defense for not finding and removing CSAM content, thus exposing them to criminal or civil liability.

The surveillance issue lurks behind the continuing efforts of the United States and the European Union to agree on a framework that meets EU privacy standards. This is the third try, the first two having been rejected by EU courts, which concluded that they did not adequately protect EU citizens’ private information from potentially being accessed by the U.S. government. U.S. law enforcement and intelligence authorities have not been willing to simply promise not to do that, and efforts to set up an adjudication process for European citizens thus far have not passed muster. We will see if the latest version fares better.

These are difficult issues that pose a contradiction for me; it has taken me three weeks to figure out how to finish this column. On the one hand, I have long thought that there is, in fact, no real privacy in the United States. Between the information you provide voluntarily when you buy something, what is recorded without your knowledge when you visit a website, and what is illegally acquired by hackers, real privacy is an illusion. That suggests to me that the European Union is expecting too much from its General Data Protection Regulation (GDPR), that its demands of the United States are also unrealistic, and that we are therefore spending a lot of time arguing about something that may not actually matter.

On the other hand, end-to-end encryption may prove me wrong. It seems to promise real privacy, at least until someone figures out how to break it, so we ought not to dispense with it lightly even if it makes it more difficult to deal with objectionable content.

William Reinsch holds the Scholl Chair in International Business at the Center for Strategic and International Studies in Washington, D.C.