Court Order to Apple to Unlock San Bernardino iPhone May Unlock Hackers

A judge’s order that Apple cooperate with federal authorities in the San Bernardino shooting investigation may have serious unintended consequences.  There are no easy answers.  Once more, a broad dialog is required.

Previously I opined about how a dialog should occur between policy makers and the technical community over encryption.  The debate has moved on.  Now, the New York Times reports that federal magistrate judge Sheri Pym has ordered Apple to facilitate access to the iPhone of Syed Rizwan Farook, one of the San Bernardino shooters.  The Electronic Frontier Foundation is joining Apple in the fight against the order.

The San Bernardino fight raises both technical and policy questions.

Can Apple retrieve data off the phone?

Apparently not.  According to the order, Apple is required to install an operating system that would allow FBI technicians to make as many password attempts as they need, without the device delaying them or deleting any information.  iPhones can erase all personal information after a certain number of failed authentication attempts.
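To make the mechanism concrete, here is a minimal sketch of that retry policy in Python.  The counter, delay schedule, and wipe threshold are illustrative stand-ins, not Apple’s actual values or implementation; the court order effectively asks for a build with the delay and wipe paths removed.

```python
import time

MAX_FAILURES = 10                     # illustrative: wipe after this many bad attempts
DELAYS = {5: 60, 6: 300, 7: 3600}     # illustrative escalating lockouts, in seconds

failed_attempts = 0

def check_password(entered: str, correct: str) -> bool:
    """The gatekeeper the order asks Apple to neuter: it delays
    repeated guesses and erases the device after too many failures."""
    global failed_attempts
    if entered == correct:
        failed_attempts = 0
        return True
    failed_attempts += 1
    if failed_attempts >= MAX_FAILURES:
        wipe_device()
    else:
        time.sleep(DELAYS.get(failed_attempts, 0))
    return False

def wipe_device():
    # Erasing the encryption keys renders all user data unrecoverable.
    print("keys erased; device contents are now permanently inaccessible")
```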

You may ask: why doesn’t the judge just order Apple to create an operating system that doesn’t require a password?  According to Apple, the password used to access the device is itself a key encrypting key (KEK) that is used to decrypt the key that in turn decrypts the stored information.  Bypassing the password check therefore doesn’t get you any of the data.  The FBI needs the password itself.
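A toy sketch of that key hierarchy, using only Python’s standard library, shows why a password bypass is useless here.  The device key, iteration count, and XOR “unwrap” are simplifications of my own for illustration; real iPhones entangle the passcode with a key fused into the hardware and use proper key wrapping.

```python
import hashlib

# Assumption: stands in for the unique key fused into each device's silicon.
DEVICE_UID = b"unique-per-device-hardware-key"

def derive_kek(password: str) -> bytes:
    # Stretch the password together with the hardware key into a KEK.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), DEVICE_UID, 100_000)

def unwrap_data_key(wrapped_key: bytes, kek: bytes) -> bytes:
    # Toy unwrap (XOR); real systems use AES key wrapping.
    return bytes(a ^ b for a, b in zip(wrapped_key, kek))

# The file-encryption key exists on disk only in wrapped form.  Without
# the password there is no KEK; without the KEK the data key, and hence
# the data, remains ciphertext.
```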

What Apple can do is install a new operating system without the permission of the owner.  There are good reasons for them to have this ability.  For one, a previous installation may have failed, or the copy of the operating system stored on the phone may have been corrupted in some way.  If technicians couldn’t install a fresh version, the phone itself would become useless.  This actually happened to me once.

The FBI can’t build such a version of the operating system on its own.  As is best practice, iPhones validate that any operating system is properly digitally signed by Apple.  Only Apple has the keys necessary to sign images.
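A minimal sketch of that boot-time check, written with the third-party `cryptography` package, might look like the following.  The function and key names are my own; Apple’s actual chain of trust is anchored in the boot ROM rather than in application code.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding

def boot_allowed(image: bytes, signature: bytes, apple_public_key) -> bool:
    """Refuse to load any OS image whose signature doesn't verify
    against Apple's public key (an RSA public key object here)."""
    try:
        apple_public_key.verify(signature, image,
                                padding.PKCS1v15(), hashes.SHA256())
        return True
    except InvalidSignature:
        return False
```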

With a new version of the software on the iPhone 5c, FBI technicians would be able to effect a brute force attack, trying all passwords until they found the right one.  This won’t be effective on later-model iPhones because their hardware slows down successive attempts, as detailed in this blog.
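Here is a sketch of what that brute force looks like against a four-digit code, with `try_passcode` standing in for whatever interface the FBI’s tool would use to submit guesses (an assumption on my part).  At even a dozen attempts per second, the full space of 10,000 codes falls in under fifteen minutes once the delays and the wipe are gone.

```python
from itertools import product
from string import digits

def brute_force(try_passcode):
    # Enumerate all 10,000 four-digit codes; with the delay and wipe
    # logic removed, nothing stops us from walking the whole space.
    for combo in product(digits, repeat=4):
        candidate = "".join(combo)
        if try_passcode(candidate):
            return candidate
    return None

# Usage with a stand-in oracle:
found = brute_force(lambda code: code == "0429")
print(found)  # -> "0429"
```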

Would such a capability amount to malware?

Kevin S. Bankston, director of New America’s Open Technology Institute, has claimed that the court is asking Apple to create malware for the FBI to use on Mr. Farook’s device.  There’s no single clean definition of malware, but a good test as to whether the O/S the FBI is asking for is in fact malware is this: if this special copy of the O/S leaked from the FBI, could “bad guys” (for some value of “bad guys”) also use the software against the “good guys” (for some value of “good guys”)?  Apple has the ability to write into the O/S a check of the device’s serial number, so that the special build runs only on that one phone.  Bad guys could not modify that number without invalidating the signature the phone checks before loading.  Thus, by this definition, the software would not amount to malware.  But I wouldn’t call it goodware, either.
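Here is a sketch of that device pinning.  The serial value and function name are hypothetical; the point is that because the check lives inside the signed image, changing the pinned number would invalidate the signature, and the phone would refuse to boot the image at all.

```python
# Hypothetical pinned identifier, baked into the signed OS image.
PINNED_SERIAL = "EXAMPLE-SERIAL-ONLY"

def os_should_run(device_serial: str) -> bool:
    # The special build refuses to run anywhere but on the one pinned device.
    return device_serial == PINNED_SERIAL
```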

Is a back door capability desirable?

Unfortunately, here there are no easy answers, only trade-offs.  On the one hand, one must agree that the FBI’s investigation is impeded by the lack of access to Mr. Farook’s iPhone, and as other articles show, this case is neither the first, nor will it be the last, of its kind.  As a result, agents may not be able to trace leads to other possible co-conspirators.  A Berkman Center study claims that law enforcement has sufficient access to metadata to determine those links, and there’s some reason to believe that.  When someone sends an email, the email servers between sender and recipient log that a message was sent from one person to another.  A record of phone calls is kept by the phone company.  But does Apple keep a record of FaceTime calls?  Why would they, if it meant a constant administrative burden, not to mention additional liability and embarrassment, when (not if) they suffer a breach?  More to the point, having access to the content on the phone gives investigators clues as to what metadata to look for, based on what applications were installed and used on the phone.
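To illustrate the metadata point: even when a message body is encrypted, the routing headers that servers log are not.  A few lines of Python’s standard library pull out exactly the sender-to-recipient links the Berkman study refers to; the sample message below is made up.

```python
from email import message_from_string

raw = """From: alice@example.com
To: bob@example.com
Date: Tue, 16 Feb 2016 10:00:00 -0800
Subject: (body may be encrypted; these headers are not)

...ciphertext...
"""

msg = message_from_string(raw)
# Who talked to whom, and when -- no content required.
print(msg["From"], "->", msg["To"], "at", msg["Date"])
```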

If Apple had the capability to access Mr. Farook’s iPhone, the question would then turn to how it would be overseen.  The rules about how companies handle customer data vary from one jurisdiction to another.  In Europe, the Data Protection Directive is quite explicit, for instance.  The rules are looser in the United States.  Many are worried that if U.S. authorities have access to data, so will other countries, such as China or Russia.  Those worries are not unfounded: a technical capability knows nothing of politics.  Businesses fear that if they accede to U.S. demands, they must also accede to others if they wish to sell products and services in those countries.  This means that there are billions of dollars at stake.  Worse, other countries may demand more intrusive mechanisms.  As bad as that is, and it’s very bad, there is worse.

The Scary Part

If governments start ordering Apple to insert or create malware, what other technology will also come under these rules?  It is plain as day that any rules that apply to Apple iPhones would also apply to Android-based cell phones.  But what about other devices, such as televisions?  How about refrigerators?  Cars?  Home security systems?  Baby monitors?  Children’s toys?  And this is where it gets really scary.  Apple has one of the most competent security organizations in the world.  They probably understand device protection better than most government clandestine agencies.  The same cannot be said for other device manufacturers.  If governments require these other manufacturers to provide back door access, it would be tantamount to handing the keys to all our homes to criminals.

To limit this sort of damage, there needs to be a broad consensus as to what sorts of devices governments should be able to access, under what circumstances that access should happen, and how that access will be overseen to avert abuse.  This is not an easy conversation.  That’s the conversation Apple CEO Tim Cook is seeking.  I agree.
