NYPD Admits the Battle Over Apple Will Set a Crucial Precedent

Apple is still fighting with the US government over whether it should create special software to help the DOJ unlock an iPhone used by one of the San Bernardino shooters. But government officials and Apple execs agree on one key point: it's not about one phone. This is about the future of security. In an op-ed for the New York Times today, New York Police Department Commissioner William Bratton and NYPD Deputy Commissioner of Intelligence and Counterterrorism John J. Miller admitted that what Apple has been asked to do will shape how the government demands access to secured devices from tech companies in the future.

"The ramifications of this fight extend beyond San Bernardino," Bratton and Miller write. The NYPD bosses say that the government's demand boils down to restoring "a key that was available until 2014". This is a reference to the change Apple made in 2014, when it upgraded its encryption.

While it's easy to imagine that most government officials wish that they could time travel and convince Apple not to upgrade its security, or that Apple could somehow remotely downgrade all of its phones to iOS 7 and other software versions with weaker encryption, that's a misrepresentation of what the DOJ is asking. It is asking that Apple create software to work around security measures in place to protect encrypted data — and it is asking this in order to set a precedent for cooperation, not just for this one wild and rare incident.

What Apple is pushing back against is the idea that the government can ask it for something that it doesn't have, and compel the company to create a security bypass for its software. If the DOJ can do that, Apple argues, it sets a precedent for compelling tech companies to weaken their own security.

It definitely won't stop with one phone. The US Justice Department is already seeking court orders for at least twelve other iPhones, according to a report from the Wall Street Journal.

This is a fight about how strong our devices' security protections should be. Apple's refusal to create security-weakening software in the San Bernardino case doesn't mean the government has no other ways to break the security on the device. It certainly makes it more difficult, just as it's harder for police to knock down a locked door than it is to get the lock manufacturer to develop a special key to open it. Bratton's op-ed makes it clear that the US government prefers device security that's easier for it to crack — he's unapologetic about wanting a "front door" entrance. (The op-ed's title: Seeking iPhone Data, Through the Front Door.)

What it doesn't do is offer any compelling reason why people should embrace shitty phone security as a trade-off for making the execution of search warrants easier. Nobody wants to see terrorists get away, but framing this as a choice between terrorists winning and normal people having secure phones is a slimy move. That's not the choice. The government is attempting to conscript third-party security providers to undermine their own services as a shortcut to executing a search warrant. The choice is whether that conscription is fair and legal — or an undue burden that will weaken security for everyone.

"Google, which owns the Android system, now indicates that it will follow Apple's lead," Bratton and Miller write. "For those companies, and others like them, there is a sound argument in not wanting, even indirectly, to become an arm of the government. But when you are the two companies whose operating systems handle more than 90 per cent of mobile communications worldwide, you should be accountable for more than just sales."

Apple, Google and other companies are certainly accountable for more than just sales. But arguing that these companies have a responsibility to perform what amounts to police work — that they should break protections they have created for customers at the whim of the government — is absurd.


Comments

    90% worldwide. The US government doesn't get to piss around with my phone.

    They certainly shouldn't have access to any keys with the level of access hackers have to their databases. If you think they're a target now, wait until they have something really worth stealing!

    This is as shitty as the domestic law passed which allows them to hack any computer globally, because national security.

    What's everyone worrying about? I bet they can keep it as secure as the TSA skeleton keys! Oh... Wait...

    What's all the hooha about?
    How does writing software to unlock one iPhone compromise all iPhones?
    Are they telling Apple they need to share the code with everyone?
    If the FBI needs you to unlock a phone for an investigation, you do it. Only Apple are this arrogant.

      Neither Microsoft nor Google would readily comply with a request like this either; they've just not been asked yet. Deliberately weakening their customers' security, when their competitors have no similar legal requirement, would be business suicide. Judges can be wrong — that's why there's an appeals process, and above that the Supreme Court. There's no way they're going to comply until they've exhausted every avenue of appeal, they'd be fools if they did, and it would be foolish of anyone to expect otherwise.

    Because all iPhones run the same software. That means if Apple writes this software for one phone, it could be applied to any phone, because they all use the same operating system and the same update cycle. Just because the FBI says it needs this one phone unlocked doesn't mean it doesn't want this backdoor on all iPhones. Why would it put this much time and money into getting Apple to unlock this ONE phone? It makes more sense that it's trying to get a backdoor onto all iPhones. And that would be a breach of security that lets hackers steal data from people's iPhones far more easily.
