Australia Hopes To Strong-Arm Tech Companies Into Giving Up That Precious Encrypted Data

Last year, Australian Prime Minister Malcolm Turnbull became the subject of ridicule when he insisted his country’s laws would “prevail” in a war with mathematics to ensure law enforcement’s access to encrypted data. Now we know what the anti-encryption law says, and legislators have apparently changed tactics but learned very little.

Like the Department of Justice in the United States, Australian authorities just can’t seem to accept that they sometimes won’t be able to get access to certain data or encrypted devices. Tech companies that use encryption correctly design their products so that even they don’t have access to a locked device or encrypted communication or files.
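To put that concretely, here's a minimal sketch (in Python, using the widely available `cryptography` package; it illustrates the principle, not any vendor's actual implementation). With end-to-end encryption the key is generated and kept on users' devices, so the provider only ever stores or relays ciphertext it cannot read, and has no plaintext to hand over, warrant or not.

```python
# Minimal illustration of end-to-end encryption (not any vendor's real design):
# the key exists only on users' devices, never on the provider's servers.
from cryptography.fernet import Fernet

device_key = Fernet.generate_key()            # generated and stored on the device
cipher = Fernet(device_key)

ciphertext = cipher.encrypt(b"meet at 9am")   # this is all the provider ever sees
plaintext = cipher.decrypt(ciphertext)        # only a device holding the key can do this
print(plaintext)                              # b'meet at 9am'
```

Serve the provider with a warrant in that model and the best it can produce is the ciphertext.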

Still, agencies such as the FBI have insisted that a special backdoor be built into encryption schemes, one that, they say, only law enforcement would be able to use. This isn't practical, because it just means leaving a security hole that could be found by anyone else.

But law enforcement uses fear and guilt as its secret weapons, and when in doubt, it will always invoke terrorists and paedophiles to get its way. New draft legislation was released by the Australian Department of Home Affairs on Tuesday, and the government insists its rules won't compromise security but are necessary to catch the bad guys.

“In the last 12 months, 200 cases have arisen where our investigations for serious crimes have been impacted by our inability to access that data under the existing legislation,” Cyber Security Minister Angus Taylor told ABC Australia. “So that means the risk here is that criminals, terrorists, paedophiles and drug smugglers are getting away with their crimes without us being able to hold them to account.”

While it's true that criminals (and politicians) can use encrypted messaging to get away with crimes, experts have consistently argued that there's no such thing as a safe backdoor. It's a mantra in the cybersecurity community: a backdoor is nothing but a security flaw that needs to be patched, and not patching it would be irresponsible. (Just last year, the NSA showed how leaving such flaws unpatched can backfire.)

But Australia thinks it’s found a middle path with its new legislation.

The bill makes various amendments to current laws around search warrants and sets up a three-tiered framework for compelling tech companies to work with law enforcement to recover secure data.

An explanatory document puts the legalese into plain English, but it’s too early to tell if the legislation itself does what lawmakers think it does. Experts and the public will have four weeks to read and comment on it before the bill advances to the next stage of approval.

The most important thing to know is that lawmakers are insisting that “the safeguards and limitations in the Bill will ensure that communications providers cannot be compelled to build systemic weaknesses or vulnerabilities into their products that undermine the security of communications”. That alone would mean that a company could not be compelled to build a backdoor into its products.

The rules apply to foreign and domestic services and manufacturers, so anyone doing business in Australia would be subject to compliance. The three-tiered framework escalates from asking providers for voluntary cooperation to requiring that new tools be built for individual cases. Authorities would first need a warrant for whatever data access they're seeking, and a company would be given the option to comply.

If a company refuses, a Technical Assistance Notice can be issued that requires it "to give assistance they are already capable of providing that is reasonable, proportionate, practicable and technically feasible". Further refusal can result in a fine of up to $10 million for an organisation or $50,000 for an individual.

There are a few things going on here. For the second stage, it appears the government is betting that some companies secretly retain access to their products' data and simply don't want to reveal that to the public.

Additionally, it’s hoping to get easier access to communications that aren’t properly encrypted. The explanatory document cites the statistic that more than “93 per cent of Google’s services and data are encrypted”. Law enforcement wants an easier crack at the other seven per cent.

After speaking with Australia's cyber security minister, ABC outlined another example:

Apple won’t be forced to create a back door for iMessage, where the encryption key is different for every user.

But it does hold a single encryption key for its iCloud services — something the Government could request access to.
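The distinction the ABC is drawing comes down to who holds the key. Here's a rough sketch of the two models (the names and flow below are illustrative assumptions, not Apple's actual architecture):

```python
# Illustrative contrast between per-user keys and a provider-held key
# (assumed structure for the sake of example, not Apple's real implementation).
from cryptography.fernet import Fernet

# "iMessage-style": every user holds their own key; the provider never does.
alice_key = Fernet.generate_key()               # lives only on Alice's devices
message = Fernet(alice_key).encrypt(b"hello")   # provider relays ciphertext it cannot read

# "iCloud-style": the service operator holds a key for the stored data,
# so it can decrypt that data and therefore can comply with a request to do so.
provider_key = Fernet.generate_key()            # held by the provider
backup = Fernet(provider_key).encrypt(b"photo bytes")
recovered = Fernet(provider_key).decrypt(backup)
```

In the first model there is nothing useful for the company to hand over; in the second, there is.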

We asked Apple to comment on the assertion that it could easily be compelled to turn over iCloud data if requested, but did not receive an immediate response.

The third level is the one that everyone will be watching. Authorities are given the power to issue a Technical Capability Notice that requires “a designated communications provider to build a new capability that will enable them to give assistance as specified in the legislation to ASIO and interception agencies”. It will be up to Australia’s attorney general to decide whether a request is reasonable and technically feasible.

This is an ethically dubious area, and one that the bill's own no-backdoor safeguard could end up negating. One example of how this would work is installing malware or other software provided by a government agency onto a specific device. One could imagine an agent engineering a situation in which a suspect needs to replace their iPhone and Apple supplies them with a dirty unit.

Other examples, such as “providing technical information like the design specifications of a device or the characteristics of a service”, could easily loop back around to an argument that the company is being “compelled to build systemic weaknesses or vulnerabilities into their products”.

It’s a borderline philosophical question that will certainly be hashed out in the courts if this bill passes: If you’re aiding the government in its attempts to break your security, does it count as creating a vulnerability?

It's one thing for an agency to approach a company, ask it to crack a phone, and for the company to do its best to try. It's another thing to hand over the blueprints to aid the government in its quest to find vulnerabilities. Software has no moral opinion: as far as the system is concerned, anyone trying to find a security hole is a bad actor.

And let's say Apple finds a way to recover some specific data the government requests by discovering a vulnerability. It could fulfil that one-time request, but it would then have an obligation to patch the flaw, sending the process back to square one.

There are many other parts of this legislation that the infosec community will take issue with, but if it passes, it could be in tech companies’ best interest to just encrypt the hell out of everything. Apple, Google, Facebook and Microsoft all saw the PR nightmare that comes with secretly working with intelligence agencies back when Edward Snowden revealed their activities with the NSA in 2013.

Implementing end-to-end encryption in every way they possibly can would take a lot of the uncomfortable obligations off of them. And there's still time for the tech giants to flex their significant lobbying muscle and push back on this legislation altogether.

[Reuters, Australian Department of Home Affairs]

