In April, not long after Apple refused to unlock the iPhone of one of the San Bernardino shooters, a number of U.S. law enforcement officials and agencies, including the FBI, announced their support for an encryption bill that would require companies to “comply with court orders to protect Americans from criminals and terrorists.” In short, the bill would force companies to build into all software running on cell phones and personal computers a backdoor, or “golden key,” for the authorities. Although that legislative initiative failed earlier this year, a new effort to expose Americans’ data is under way in Congress.
Developing such keys is both unproductive and dangerous. Although the primary worry in the United States is that they could lead to mass surveillance, the lesser-known hazard is that they could jeopardize the safety of human rights activists, especially those based abroad who rely on U.S. encryption tools to do their work.
There is no guarantee that a golden key would even work, given the ease with which rogue hackers all over the world can develop their own encryption tools. A February study by the Berkman Klein Center for Internet & Society at Harvard University surveyed some 865 encryption products from 55 countries, roughly two-thirds of which were built outside the United States. About two-thirds of the products are commercial and the rest are open source, though some of the free products are merely libraries of building blocks rather than complete encryption systems. Given the resources available to ill-intentioned hackers, it would be impossible to stop them from building strong encryption applications of their own.
Meanwhile, building encryption software with a golden key for government access would gravely compromise security for law-abiding citizens around the world. It would encourage criminals and terrorists to build their own illegal software to frustrate the authorities, leaving those without the technological skills to do the same (most of the world) more vulnerable to attack.