On February 23, New York City police commissioner William Bratton and deputy commissioner for counterterrorism and intelligence John Miller published an op-ed in The New York Times. Using hasty logic, Bratton and Miller backed the government’s mandate that Apple act against its will and the best interests of its customers to deliberately compromise the security of one of its iPhones. Their piece is short-sighted and ignores recent history.
First, Bratton and Miller rely on the misguided assumption that because similar software designed to compromise earlier versions of the iPhone was never stolen, future iterations of the software can be kept safe as well. Of course, we know only that we aren’t aware of any previous thefts, not that none occurred. An old adage in our sector bears repeating: there are only two kinds of companies in the world, those that have been hacked and those that don’t know it yet.
Implying that past safety correlates to future safety is like a new driver saying he won’t wreck the car because he hasn’t wrecked one yet. We know such conclusions are dangerous.
Those who support Apple, including tech companies of every size and type, believe that complying with the order would create a treacherous precedent. That precedent would be a crutch for repressive regimes around the world to lean on when they set out to compel developers to help them curb security, privacy, and free speech in their own countries. Somehow, Bratton and Miller believe companies can simply “insulate” themselves by forcing a country like China to go through the U.S. State Department to access data on American-made products. A more likely outcome would be China erecting trade barriers to keep out American innovators who refuse to comply with its data demands.
The worldview Bratton and Miller paint is a pleasant one, but it reflects a naive approach to international affairs and human rights.
Bratton and Miller point out that there is a 200-year precedent for search warrants, and I’m happy to concede that point. But they fail to acknowledge that search warrants executed 200 years ago did not threaten the privacy and security of tens of millions of other Americans in the process. A door broken down in a search conducted in 19th-century Virginia did not simultaneously break down a door in Pennsylvania. Yet today, breaking down the “door” in one phone could simultaneously break down the doors in millions of other phones, giving thieves, hackers, foreign governments, terrorists, and other bad actors access to our most private financial, health, business, and personal information.
Finally, supporters of the government’s position tell us that the new undermining technology would be limited to very narrow circumstances. Unfortunately, we’ve heard the same tired song from the government before. The Patriot Act included a “sneak-and-peek” provision for surveillance, which the government pledged to use only to track suspected terrorists. At one time that may have been true, but in 2013 the provision was used more than 11,000 times, mostly in narcotics cases. Of the 11,129 cases that year, only 51 searches, roughly 0.5 percent of the total, related to terrorism. At best, law enforcement’s track record on using newly granted authority with discretion is less than stellar.
We live in a complicated world, and the authors would do well to acknowledge that. Data privacy is a complicated matter, but Bratton and Miller must be more reflective about what is and isn’t in the nation’s best interest before playing loose with the facts. There is no way to guarantee that a mandated “master key” won’t fall into the wrong hands, that foreign nations will request access to it only through proper channels, or that, once created, the key’s use won’t spiral out of control. Combating terrorism and protecting Americans is a priority for tech companies, and Apple is right to refuse to build a tool that would undermine those objectives.