Why You Should Care About Apple's Fight With the FBI

The FBI wants Apple's help to investigate a terrorist attack. Apple says providing this help is the real danger. We've reached a boiling point in the battle between tech companies and the government over encryption. And what happens will affect anyone who uses a smartphone, including you.

After the San Bernardino shootings, the FBI seized the iPhone used by shooter Syed Rizwan Farook. The FBI has a warrant to search the phone's contents, and because it was Farook's work phone, the FBI also has permission from the shooter's employer, the San Bernardino County Department of Public Health, to search the device.

Legally, the FBI can and should search this phone. That's not up for debate. If the FBI gets a warrant to search a house and the people who own it say OK, there's no ambiguity about whether it can search the house.

But if the FBI comes across a safe in that house, the warrant and permission do not mean it can force the company that manufactures the safe to create a special tool for opening its safes, especially a tool that would make other safes completely useless as secure storage. That's the situation that Apple's dealing with here.

The FBI obtained an order from a California district court asking Apple for assistance cracking Farook's passcode. The court order doesn't flat-out demand that Apple unlock the phone, which is an iPhone 5C running iOS 9. Instead, the judge is asking Apple to create a new, custom, terrorist-phone-specific version of its iOS software to help the FBI unlock the phone. Security researcher Dan Guido has a great analysis of why it is technically possible for Apple to comply and create this software. (It would not be if Farook had used an iPhone 6, because Apple created a special security protection called the Secure Enclave for its newer phones that cannot be manipulated by customising iOS.)

The fight isn't over whether Apple can comply in this case. It's whether it should.

If Apple makes this software, it will allow the FBI to bypass security measures, including an auto-delete function that erases the key needed to decrypt data after ten incorrect passcode attempts, as well as a timed delay after each wrong guess. Since the FBI wants to use the brute force cracking method (basically, trying every possible passcode), both of those protections need to go before Farook's passcode can be cracked. (Of course, if he used a shitty passcode like 1234, the delay wouldn't be as big a problem, since the FBI could quickly guess it.)
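To get a feel for why both protections have to go, here's a rough back-of-the-envelope sketch in Swift. It's not Apple's code, and the guess rate is an assumption (a hardware-limited figure commonly cited for devices of this era); it just shows how quickly an unthrottled brute force attack finishes, and how little a throttled one achieves before the auto-erase kicks in.

```swift
import Foundation

// Illustrative only: the guess rate is an assumption, not an Apple specification.
let guessesPerSecond = 12.5   // assumed rate once software-enforced delays are removed

func worstCaseHours(digits: Int) -> Double {
    let keyspace = pow(10.0, Double(digits))      // 10^digits possible passcodes
    return keyspace / guessesPerSecond / 3600.0   // seconds converted to hours
}

print(String(format: "4-digit passcode: about %.1f hours, worst case", worstCaseHours(digits: 4)))
print(String(format: "6-digit passcode: about %.0f hours, worst case", worstCaseHours(digits: 6)))

// With the stock protections in place, the attack never gets that far:
// escalating delays slow each wrong guess, and the tenth failure can erase
// the key needed to decrypt the data, taking everything with it.
let attemptsBeforeWipe = 10
print("Attempts allowed before auto-erase: \(attemptsBeforeWipe)")
```

In other words, with the delays and the wipe stripped out, even a six-digit passcode falls within about a day of guessing; with them in place, the FBI gets ten tries.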

The security measures that the FBI wants to get around are crucial privacy features of iOS 9, because they safeguard your phone against criminals and spies using the same brute force attack. So it's not surprising that Apple is opposing the court order. There is more than one person's privacy at stake here!

Apple equates building a new version of iOS with building an encryption backdoor. CEO Tim Cook published a message emphasising that the company can't build a backdoor for one iPhone without screwing over security for the rest:

In today's digital world, the "key" to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.

The government suggests this tool could only be used once, on one phone. But that's simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.
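Cook's point that a key is "only as secure as the protections around it" maps directly onto how symmetric encryption behaves in practice. Here's a purely illustrative CryptoKit sketch (it is not how iOS actually protects your data; among other things, iOS entangles your passcode with a per-device hardware key, which this leaves out):

```swift
import CryptoKit
import Foundation

// The "key" Cook describes: a piece of information that unlocks the data.
let key = SymmetricKey(size: .bits256)
let secret = Data("messages, photos, location history".utf8)

// The ciphertext is useless on its own...
let sealed = try! AES.GCM.seal(secret, using: key)

// ...but anyone who obtains the key, or a way around it, gets everything back.
let recovered = try! AES.GCM.open(sealed, using: key)
print(String(data: recovered, encoding: .utf8)!)
```

The security of the whole system collapses to the question of who can get at the key, which is exactly why Apple doesn't want to build a tool that weakens the protections standing between an attacker and that key.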

Apple will be writing its own malware if it complies with this order. It would be creating the best tool to break into its own (older) devices.

"Essentially, the government is asking Apple to create a master key so that it can open a single phone," the Electronic Frontier Foundation wrote in a statement supporting Apple. "And once that master key is created, we're certain that our government will ask for it again and again, for other phones, and turn this power against any software or device that has the audacity to offer strong security."

Don't sit there chuckling if you use an Android, by the way. If Apple is compelled to create this malware, it will affect anyone who uses technology to communicate, to bank, to shop, to do pretty much anything. The legal basis for requesting this assistance is the All Writs Act of 1789, an 18th century law that is becoming a favourite for US government agencies trying to get tech companies to turn over user data. The AWA is not really as obscure as Apple suggests, but it is a very broad statute that allows courts established by Congress to "issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law".

The Department of Justice has even tried to use it to force Apple to turn over suspects' messages before. I know 18th century law sounds boring, but this is an 18th century law that could screw you big time.

The All Writs Act can only force a company to do something if it's not an "undue burden". Seems like making Apple create malware that will fundamentally undermine its core security features is an enormous burden. And if it's not deemed "undue" in this case, that sets a horrible precedent. After all, if compelling Apple to maim itself is allowed, compelling Google and Facebook and Microsoft to write security backdoors would also be allowed.



Comments

    Just out of curiosity, what if the phone had been jailbroken and wasn't running genuine iOS? Would Apple still be able to do this?

      I don't think it'd make a difference. They'd still be undermining the integrity of the Secure Enclave which would affect all users, jailbroken or otherwise. The SE is still just as secure on jailbroken iPhones.

        It's a 5C, so it doesn't have a Secure Enclave. The security measures are implemented in iOS in this case.

          Wait.... what you said.


    If it's a work phone, why doesn't the company the guy worked for unlock it? I have to do it all the time for my users when they forget their passcodes.

      Most likely he didn't follow company policy, thus preventing the employer's IT staff from unlocking it.

      If the workplace could unlock it, it most likely would have been asked to already, with Apple only asked after the employer couldn't.

    Give the phone to Apple, break the security, copy the data and give it back to the FBI!

      That's what I thought as well. The FBI would need a Court Order again for any future use of the software. I suppose, though, once Apple writes the software, someone will try to steal it and/or buy it. As I understand it, Apple's principle is that it should not be forced to make a "backdoor" to its encryption.

      I wonder instead if the FBI or Apple could clone the phone's digital contents and then have the FBI run that clone in a simulator. In that case, the time required for a brute force decryption wouldn't matter.

      The defence can say anything submitted via that method is not first-hand evidence, so Apple would need to give police/defence access to ensure everything was provided.

    "Don’t sit there chuckling if you use an Android, by the way." Seriously? Who cares? What do you think you have to hide? You could put everything from my phone up on a billboard in the CBD for all I care. All it would do is bore people to death (and I'm pretty sure my life is a lot more exciting/interesting than yours).

      Point clearly, massively and moronically missed.

      "Arguing that you don't care about the right to privacy because you have nothing to hide is no different than saying you don't care about free speech because you have nothing to say" -Snowden

      So you think there won't be any consequences when you have an illness and regularly ring your specialist's office to make appointments, and your insurance company trolls your phone data, red flags you as a high risk and then leaves you without health insurance? That's the kind of end game we can look forward to if we give away our privacy rights.

      You live up to your reputation yet again gizmodo's resident moron. Have you pulled your head out of your rectum yet? Based on this comment it must be pretty far up there.

    I don't see the issue here. Why can't Apple unlock the phone without providing any of the "unlocking" details to the FBI?

    Provide the device unlocked, win win.

      The issue is that the FBI doesn't just want the iPhone in question unlocked; it wants to be able to unlock every phone.

        Well the court order is very specific to one handset.

          The court order doesn’t flat-out demand that Apple unlock the phone, which is an iPhone 5C running iOS 9. Instead, the judge is asking Apple to create a new, custom, terrorist-phone-specific version of its iOS software to help the FBI unlock the phone.
          Still specific to one handset... but others would be affected, and I'll bet the FBI knows it.

          I just went with what the article said. Didn't read the original court order...


            Once Apple unlocks the handset with the custom iOS and hands it over to the FBI, there seems nothing to stop the FBI from reverse-engineering the custom iOS for its own future use.

              Is there anything preventing anyone from writing a custom version of iOS and doing what the FBI want done?

              Surely now that they know it can be done, they would be able to do it themselves. After all, figuring out if something can be done is the hardest part.

              Except the court order stipulates the device can remain on Apple premises...
              I'm still undecided on the whole thing to be honest. The point that irks me is that I don't think Apple's stance of "but if we do it, you'll want it done for others" is really a good objection to a specific court order. The further consequences shouldn't be a concern of a single request, that's something for debate/decision by the lawmakers at a higher level.

    See, what we're dealing with here, folks, is what we in the Digital Media Sales game euphemistically call "managing client expectations" (the client here being the effa-bee-eye). Apple COULD do what the feds ask - but if you bend over and drop your pants even once for a 'special case', you're setting up the 'expectation' that you can do it again in future, and the 'client' will (naturally) push for that next time they're in a jam. As we all know, you give the feds a hand and before you know it they'll take both arms, one of your lungs, and the blood of your firstborn. That's why Apple can't give in to this, regardless of the individual 'merits' of this case.

    George Brandis is a f*cking idiot too, weighing in with his opinion, but we wouldn't expect anything less from Mr. "I-don't-understand-metadata-but-I-want-to-collect-it" ...


    Seriously, Apple, just get the data off and give it to the FBI. Anyone who thinks Apple has to make a master key or special OS is fooling themselves. Within a few minutes the iOS development team could have that data. Apple might be rich, but that doesn't make them above the law.

