A brief history of the crypto wars
Asymmetric encryption emerged in the 1970s against the backdrop of heavy government surveillance used to weaponize information against domestic and foreign enemies. Since then, law enforcement and government intelligence agencies have argued that widespread private use of encryption can hamper criminal investigations and national security efforts. For many cryptographers and privacy activists, however, this surveillance was the primary threat that made widespread use of cryptography a moral necessity in the first place.
One pivotal moment in this history was the U.S. federal government's criminal investigation of Phil Zimmermann for distributing his encryption software, Pretty Good Privacy (PGP), without a munitions export license. In the 1990s, U.S. arms control regulators treated cryptographic software as a munition, so posting such a program on the internet, where anyone overseas could download it, counted as an unlicensed export. The case dragged on for three years before eventually being dropped, and export control rules were later rewritten once it became obvious that software couldn’t be contained like rocket motors.
Fast forward to 2021, when a presidential executive order charged the National Institute of Standards and Technology (NIST) with setting standards that require the encryption of data to protect software supply chains. What’s more, forms of PGP software have been validated under NIST’s Federal Information Processing Standards for more than two decades. This shift illustrates how the federal government eventually recognized the importance of strong encryption, and it suggests that the conflict between the security and privacy camps, despite the different shapes and forms it has taken over the decades, reflects a lag in governmental adaptation to technological change.
While the overarching debate generally remains the same, the crypto wars now primarily revolve around end-to-end encryption (E2EE). Widespread public access to E2EE means that criminals can also use it with impunity, making it harder to detect and investigate the transmission of illegal content, such as child sexual abuse material and terrorist plots. Law enforcement officials describe investigations “going dark” when a lack of access to communications and data leaves them unable to proceed. This is not only a surveillance and security problem but also a content moderation problem: E2EE makes it harder for platforms to defend users from spam, abuse, and harassment, and encrypted channels can become vectors for harassment and misinformation campaigns.
Computer scientists and privacy advocates, in turn, counter that content moderation strategies go beyond merely reading messages; automated data analysis, user reporting workflows, and message flagging mechanisms are among the available tools. And with ongoing research in this field, E2EE isn’t necessarily the roadblock to content moderation that many assume it to be.
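To make the reporting idea concrete, the sketch below shows a simplified user-report flow in the spirit of “message franking”: the sender commits to each message with a keyed hash, the relaying server stores only that opaque commitment, and a recipient who reports abuse reveals the plaintext and key so a moderator can verify the report is genuine, all without the platform decrypting traffic in transit. This is a minimal illustration under assumed names and key handling, not the implementation of any particular messenger.

```python
import hmac
import hashlib
import os

# Sketch of a message-franking-style report flow under E2EE.
# The server only ever stores an opaque commitment; plaintext is
# revealed to a moderator only when a recipient chooses to report.

def send(plaintext: bytes):
    """Sender side: commit to the plaintext with a fresh franking key."""
    franking_key = os.urandom(32)
    commitment = hmac.new(franking_key, plaintext, hashlib.sha256).digest()
    # The commitment travels to the server in the clear; the franking key
    # and the plaintext travel inside the end-to-end encrypted payload.
    return franking_key, commitment

def verify_report(plaintext: bytes, franking_key: bytes, commitment: bytes) -> bool:
    """Moderator side: check that the reported plaintext matches the
    commitment the server stored when the message was relayed."""
    expected = hmac.new(franking_key, plaintext, hashlib.sha256).digest()
    return hmac.compare_digest(expected, commitment)

if __name__ == "__main__":
    msg = b"example abusive message"
    key, tag = send(msg)                      # server stores `tag`, never sees `msg`
    print(verify_report(msg, key, tag))       # True: the report is verifiably authentic
    print(verify_report(b"forged", key, tag)) # False: recipients cannot frame senders
```

The point of such schemes is that accountability comes from the recipient’s choice to report, not from the platform’s ability to read messages in transit.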