In an open letter to customers Tuesday, Apple said it will not unlock the iPhone of one of the San Bernardino shooters, as ordered by a federal judge. On Tuesday the judge said the tech giant must give investigators access to the encrypted information on the phone; Apple had "declined to provide voluntarily" the requested technical assistance. Tim Cook, who penned the letter, made it crystal-clear where the company stands on the matter. He said the judge's order has legal implications far beyond the San Bernardino investigation and that unlocking an iPhone would threaten the security of customers.
In the letter, Cook explains what Apple sees as the broader implications:
The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals.
The iPhone 5C in question was used by Syed Rizwan Farook but belongs to the San Bernardino County Department of Public Health, his former employer and the main target of the attacks. The county has consented to requests by investigators to search the phone's data. The problem lies in the passcode that must be entered to unlock the device, which the FBI does not know.
Investigators say they are in possession of iCloud data from the phone, which is backed up to Apple's servers, but the backups stop about a month before the shooting. If Farook turned off the cloud backups on purpose, there could be information of interest stored locally on the device. The only way to know would be to get past the lock screen and into the device. A four-digit passcode is the key.
In the court order, based on a law passed more than 200 years ago, Apple was instructed to help the FBI unlock the device. Normally, after 10 unsuccessful attempts to unlock an iPhone, all the personal data is wiped clean. That prevents someone from guessing over and over until the passcode is finally cracked.
U.S. Magistrate Judge Sheri Pym ordered Apple to disable that auto-wipe feature and allow the FBI to use a computer to guess the passcode with a "brute force" attack — the computer would cycle through every possible combination, which is far faster than entering the combinations manually.
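To see why disabling the auto-wipe matters, here is a toy Python sketch of the mechanism. This is not Apple's actual firmware logic; the passcode value, the `try_unlock` helper, and the 10-attempt wipe are simplified illustrations of how a limit on failed attempts defeats a brute-force search over the 10,000 possible four-digit codes.

```python
SECRET = "7391"      # the unknown passcode (illustrative value only)
MAX_ATTEMPTS = 10    # iOS can erase the device after 10 failed tries

def try_unlock(guess, attempts_used, wipe_enabled=True):
    """Simulate one unlock attempt; returns (unlocked, wiped)."""
    if wipe_enabled and attempts_used >= MAX_ATTEMPTS:
        return False, True          # data wiped: the search is over
    return guess == SECRET, False

def brute_force(wipe_enabled):
    """Try every 4-digit code from 0000 to 9999 in order."""
    for attempts, code in enumerate(f"{n:04d}" for n in range(10_000)):
        unlocked, wiped = try_unlock(code, attempts, wipe_enabled)
        if wiped:
            return None              # locked out long before finding the code
        if unlocked:
            return code
    return None

print(brute_force(wipe_enabled=True))   # None: wipe triggers after 10 tries
print(brute_force(wipe_enabled=False))  # finds "7391" within 10,000 tries
```

With the wipe enabled, an attacker gets at most 10 of the 10,000 possibilities before the data is destroyed; with it disabled, exhausting every code is trivial for a computer, which is exactly what the order asked Apple to make possible.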
This is why Tim Cook calls the request "dangerous." If Apple creates software that allows a brute force attack on the passcode, the encryption protection becomes pointless, because any attacker — whether a government or a criminal — could access personal data using the same tool, or "backdoor." Cook says that while the government implies it would be used only this one time, it could later be applied to any iPhone on the planet:
In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.
What exactly investigators hope to find on Farook's iPhone is unclear. He and his wife, the other suspect in the attacks, physically destroyed their two personal cellphones, and the FBI has been unable to recover data from them. The couple also removed a hard drive from their computer, which authorities have not found. Farook's work phone, the iPhone in question, was found in a Lexus owned by his family. It's not clear whether he forgot about it or didn't care about what was on it.
Ultimately, Apple sees this as bigger than just the San Bernardino case. Cook wrote that he believes the FBI's intentions are good, but that a backdoor to Apple products is a bad idea. Cook finished his letter laying it all out there: "We fear that this demand would undermine the very freedoms and liberty our government is meant to protect."
What ultimately happens will be up to the courts, but given the public stance Apple has taken and the positive reaction from customers, you can bet Apple won't back down easily.