The FBI wants Apple to help hack into the iPhone used by one of the terrorists in the San Bernardino, Calif., attack last December. The feds seek information such as the contact list and text messages that are locked behind a password in the phone. That information could help in their investigation of Syed Rizwan Farook.
At first glance, a no-brainer, right? What company wouldn’t want to grab a starring role in “To Catch A Terrorist”?
But Apple chief Tim Cook refuses. He says the government’s request would force Apple to build a “back door” key to its operating system that could then be exploited by criminals and hackers around the world. He argues in effect that the damage to users’ privacy — all the sensitive personal information on their phones — far outweighs the benefit to law enforcement in this case.
Earlier this week, a federal magistrate ordered Apple to cooperate with the FBI. The company says it will appeal.
You’ll be hearing a lot more about this as it moves up the legal food chain toward the Supreme Court. Everyone with a computer or a smartphone has a stake in how this plays out.
Obvious Question: Is Apple siding with terrorists?
Obvious Answer: Not at all.
Farook had backed up his data to the cloud until about six weeks before the attack. Apple has turned over that data to the FBI.
But there’s still more information on his phone. The FBI wants Apple to help disable a password-protection feature that would wipe the phone’s data after someone makes 10 unsuccessful password attempts. The feds want the ability to make tens of millions of guesses at Farook’s password, presumably guessing right eventually.
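The logic of that auto-erase feature can be sketched in a few lines of code. This is a toy model, not Apple's actual implementation: it assumes a simple numeric passcode and a counter that wipes the device after ten failures, and it shows why brute-force guessing fails as long as the limit is in place.

```python
# A minimal sketch (not Apple's real security design) of why an
# erase-after-10-failures limit defeats brute-force passcode guessing.

class SimulatedPhone:
    """Toy model: locked storage that erases itself after too many failures."""
    MAX_ATTEMPTS = 10

    def __init__(self, passcode):
        self._passcode = passcode
        self._failures = 0
        self.erased = False

    def try_unlock(self, guess):
        if self.erased:
            return False  # data is gone; no guess can succeed now
        if guess == self._passcode:
            return True
        self._failures += 1
        if self._failures >= self.MAX_ATTEMPTS:
            self.erased = True  # the wipe the FBI wants disabled
        return False

def brute_force(phone, keyspace):
    """Guess every code in order; useless once the phone erases itself."""
    for code in keyspace:
        if phone.try_unlock(code):
            return code
    return None

# Even over a small 4-digit keyspace, the search dies after 10 tries:
phone = SimulatedPhone("7391")
found = brute_force(phone, (f"{i:04d}" for i in range(10000)))
print(found, phone.erased)  # None True
```

With the limit removed, the same loop would eventually hit the right code, which is exactly the capability the FBI is asking for.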
Apple claims that to comply with the FBI, the company would need to create a sweeping software program that could be used on other devices, imperiling other users’ security. The feds, in contrast, say they seek a narrowly tailored tool to unlock a single phone.
We won’t wander into those deep electronic weeds. Suffice it to say, we imagine judges will be hearing plenty of evidence from computerati about why Apple’s technical argument is right … or wrong.
But let’s not lose sight of the larger issue here. Apple and other tech companies have created highly secure systems for a compelling reason: to thwart hacking attempts by criminals, terrorists and saboteurs who continually probe for weaknesses in computer systems.
We’re glad that on the newer iPhones, even Apple can’t break into its own system: If there’s a back door created for law enforcement, you can be sure that someone will find the key under the mat or the flowerpot.
Cyberwar is a mounting threat. The US government is defending against the possible electronic sabotage of this country’s power grid and transportation systems. Hackers routinely ransack banks, retail chains and other firms, stealing credit card data, personnel files, trade secrets.
No company would relish casting itself in the role of denying a government request to help thwart terrorists. It’s not hard to imagine how some customers won’t understand, won’t agree and will shun Apple products.
But customers depend on companies like Apple to stay one step ahead of hackers. If Apple writes software to defeat its sophisticated encryption systems, its customers will think twice. And its competitors — many of whom develop their apps and systems in foreign countries beyond the reach of US law — will seize the opening to create ever more bulletproof systems. Why bother? To gain a competitive advantage against Apple.
“We have great respect for the professionals at the FBI and we believe their intentions are good,” Cook wrote in an open letter. So do we.
That said, we also believe Apple is well-equipped to judge the dangers of creating this sort of vulnerability in its security systems. Unless a tech genius devises a brilliant alternative to resolve this conflict without risking that security, Apple should stand its ground.
©2016 the Chicago Tribune / Distributed by Tribune Content Agency, LLC.