Imagine this: Australia’s police and intelligence agencies, acting on multiple anonymous tips, have identified a plot to set off a bomb in the CBD sometime in the next 24 hours.
Now further imagine that, at a midnight hearing, they ask a magistrate to grant them a warrant to intercept the suspects’ smartphone calls and texts. Without hesitation, the magistrate approves.
Access to the suspects’ communications could prove vital to preventing devastation.
Despite this, it’s possible they will hear and see nothing. Perhaps the suspects are using Apple’s iPhones to communicate with each other, making use of its built-in encrypted messaging app iMessage and another encrypted app, Wickr.
In this made-up scenario, officials immediately ask Apple and Wickr for assistance in accessing the messages. Apple might refuse, citing its inability to comply without systemically weakening its products and undermining encryption for millions of users. Wickr might give the same reason, leaving the encrypted messages impenetrable.
Without the ability to monitor communications, officials might not have sufficient evidence to execute a search warrant or lock up the suspects. Even if they pounced on the conspirators, they couldn’t be confident of gaining access to the confiscated phones, protected as they are by passcodes that can’t be bypassed. They might have to release the suspects, tipping them off to the fact that they’re being watched.
The authorities would have one less avenue to find out when or where the bomb was going to go off, or how to stop it.
This is a hypothetical scenario, but it’s not too dissimilar to those that play out, at varying levels of severity, as Australian law-enforcement agencies protect national security, bust drug syndicates, unearth paedophile networks, and more.
Already, about 95 per cent of ASIO’s most dangerous counter-terrorism targets actively use encrypted messages to conceal their communications, Australia’s recently departed cyber security minister Angus Taylor said in June. But it’s not just terrorists.
“Few issues have vexed law-enforcement … more than this one,” Mr Taylor said then of encryption.
“They can’t get access to the data they need to stop crime and hold criminals to account.”
The dilemma facing technology companies and governments is, at its most basic, this: continue to let tech companies protect users’ privacy and security no matter what, and in a worst-case scenario let people die; or reduce that risk, but curtail users’ privacy in doing so.
Now, I’m not saying Apple and others won’t or haven’t lifted a finger to save the lives of your loved ones; they have. In some instances, for example, terrorists practise poor operational security by uploading a back-up of their phone to the cloud. Tech companies often receive those back-ups with users’ private encryption keys intact, so that users can restore their phones if they’re lost or stolen. In those cases, Apple can (and does) hand over a lot of data to law enforcement.
But not everybody thinks this is enough.
As Philip Ruddock, a former Liberal attorney-general, countered in a 2015 parliamentary inquiry concerning Australia’s mandatory metadata retention laws: “I would put it to you that the right to life, which is a human right, is of fundamentally greater importance than the right to privacy.”
And so it was with a similar sentiment that Australia unveiled its own proposed legislation for consultation in mid-August to counter encryption and the problem it’s presenting agencies.
There’s no doubt about it: this bill will undermine encryption. To understand this, you need look no further than its requirement that companies assist in “removing one or more forms of electronic protection”. And while the bill states it won’t create systemic weaknesses, tech giants aren’t buying it.
“The reality is that creating security vulnerabilities, even if they are built to combat crime, leaves us all open to attack from criminals,” says Nicole Buskiewicz, head of the non-profit Digital Industry Group, which represents Facebook, Google, Twitter, Yahoo and other tech companies in Australia.
“This could have devastating implications for individuals, businesses, public safety and the broader economy,” Buskiewicz continues. “We are extremely concerned at the lack of judicial oversight and checks and balances with this legislation.”
In reading this legislation, we must ask whether the “greater good” is being served, and, as part of that, whether the bill’s risks outweigh its benefits.
Where this bill fails, however, is in its lack of appropriate checks and balances and its potential for scope creep. If the government claims the bill exists to protect us from terrorists, illicit drugs and paedophiles, for instance, why does it also require technology companies to assist with access to encrypted communications when “protecting the public revenue”?
The question now is twofold: first, whether Australians are willing to trade their privacy and data security for the security of the nation, as well as for protection from crimes that are not pressing national security concerns; and second, whether the bill’s checks, balances and scope are appropriate.
Peter Coroneos, the author of the Cyber Breach Communication Playbook and former head of the Internet Industry Association, believes most Australians are willing to trade some privacy for national security. But he says there’ll likely be unintended consequences as a result of the passage of the 176-page bill in its current form, which he describes as “complex”.
“Something needs to be done but it probably isn’t this bill in its current form,” Mr Coroneos says. “There are going to have to be some amendments made to make it more socially palatable.
“If you read between the lines of the legislation, you see the undeniable tension that government faces within its own ranks at trying to balance the interests of those who understand the need for encryption versus the law enforcement and security community, who are intensely frustrated with the roadblocks put in front of a lot of their investigative work.”
Lawyer Patrick Fair of Baker McKenzie, who advises members of the communications industry, says the legislation “needs more work”, adding that he too would like to see more judicial oversight.
Those who argue nothing should be done, and that tech companies should continue to protect communications in all instances, will not win this debate; they’ve lost previous debates over similar legislation. Instead, they should focus their efforts on limiting the bill’s scope, strengthening its oversight, transparency and reporting mechanisms, and ensuring that the powers that be, who too often operate without appropriate checks and balances, don’t abuse the trust we place in them.