Google Backtracks, Says Its AI Will Not Be Used For Weapons Or Surveillance

Google is committing to not using artificial intelligence for weapons or surveillance after employees protested the company’s involvement in Project Maven, a Pentagon pilot program that uses artificial intelligence to analyse drone footage. However, Google says it will continue to work with the United States military on cybersecurity, search and rescue, and other non-offensive projects.

Google CEO Sundar Pichai announced the change in a set of AI principles released today. The principles are intended to govern Google’s use of artificial intelligence and are a response to employee pressure on the company to create guidelines for its use of AI.

Employees at the company have spent months protesting Google’s involvement in Project Maven, sending a letter to Pichai demanding that Google terminate its contract with the US Department of Defence. Several employees even resigned in protest, concerned that Google was aiding the development of autonomous weapons systems.

Google will focus on creating “socially beneficial” AI, Pichai said, and avoid projects that cause “overall harm”.

“How AI is developed and used will have a significant impact on society for many years to come,” Pichai wrote. “These are not theoretical concepts; they are concrete standards that will actively govern our research and product development and will impact our business decisions.”

The AI principles represent a reversal for Google, which initially defended its involvement in Project Maven, noting that the project relied on open-source software that was not being used for explicitly offensive purposes.

The principles were met with mixed reactions among Google employees. Despite Google’s commitment not to use AI to build weapons, employees questioned whether the principles would explicitly prohibit Google from pursuing a government contract like Maven in the future.

One Googler told Gizmodo that the principles amounted to “a hollow PR statement”. Several employees said that they did not think the principles went far enough to hold Google accountable – for instance, Google’s AI guidelines include a nod to following “principles of international law” but do not explicitly commit to following international human rights law.

“While Google’s statement rejects building AI systems for information gathering and surveillance that violates internationally accepted norms, we are concerned about this qualification,” said Peter Asaro, a professor at The New School and one of the authors of an open letter that calls on Google to cancel its Maven contract.

“The international norms surrounding espionage, cyberoperations, mass information surveillance, and even drone surveillance are all contested and debated in the international sphere. Even Project Maven, being tied to drone surveillance and potentially to targeted killing operations, raises many issues that should have caused Google to reject it, depending on how one interprets this qualification.”

Another Googler who spoke with Gizmodo said that the principles were a good start, mitigating some of the risks that employees who protested Maven were concerned about. However, the AI principles do not make clear whether Google would be precluded from working on a project like Maven – which promised vast surveillance capabilities to the military but stopped short of enabling algorithmic drone strikes.

Google Cloud CEO Diane Greene defended her organisation’s involvement in Project Maven, suggesting that it did not have a lethal impact. “This contract involved drone video footage and low-res object identification using AI, saving lives was the overarching intent,” Greene wrote in a blog post.

“We will not be pursuing follow on contracts for the Maven project, and because of that, we are now working with our customer to responsibly fulfil our obligations in a way that works long-term for them and is also consistent with our AI principles,” she added, confirming Gizmodo’s reporting last week that Google would not seek to renew its Maven contract after it expires in 2019.

“On most fronts, these are well thought-out principles, and with a few caveats we’d recommend that other major tech companies set out similar guidelines and objectives for their AI work,” Peter Eckersley, chief computer scientist at the Electronic Frontier Foundation, told Gizmodo.

To improve upon its principles, Google should commit to independent and transparent review to ensure that its rules are properly applied, he said. Pichai’s assertions about not using AI for surveillance also left something to be desired, Eckersley added.

“The company has constrained itself to only assisting AI surveillance projects that don’t violate internationally accepted norms,” Eckersley said. “It might be more comforting if Google tried to avoid building AI-assisted surveillance systems altogether.”

In internal emails reviewed by Gizmodo, a Google employee working on Project Maven said that the company would attempt to provide a “Google-earth-like” surveillance system, offering “an exquisite capability” for near real-time analysis of drone footage.

Academics and students in the fields of computer science and artificial intelligence joined Google employees in voicing concerns about Project Maven, arguing that Google was unethically paving the way for the creation of fully autonomous weapons.

Asaro praised Google’s ethical principles for their commitment to building socially beneficial AI, avoiding bias, and building in privacy and accountability. However, Google could improve by adding more public transparency and working with the United Nations to reject autonomous weapons, he said.

The internal and external protests put Google in a difficult position as it aims to recenter its business around the development and use of artificial intelligence. Although its contract with the US Defence Department for Maven was relatively small, Google considered its Maven work an essential step toward winning more lucrative military contracts.

Google likely planned to bid on JEDI, a cloud computing contract with the US Defence Department that could be worth as much as $US10 billion ($13 billion). It’s unclear now whether bidding on the JEDI contract would amount to a violation of Google’s newly announced principles – or whether the Pentagon would consider partnering with Google again after the company backed away from Maven.

