Google has faced a lot of backlash since Gizmodo revealed in March that the company was helping the United States Department of Defence develop artificial intelligence that could be used to analyse drone footage as part of the Defence Department’s Project Maven. Internally, the program angered many Google employees and members of the AI community. Google ultimately decided against renewing its contract for Project Maven and released a set of principles to guide its use of AI.
Now, Google is making sure the world knows it is committed to finding ways to use AI to benefit humanity through its explicitly named effort—AI for Social Good. Google announced on Monday that under the new initiative, the company has pledged to give away $US25 ($35) million in grants for humanitarian AI projects that address issues such as human trafficking, health care, disaster relief, and environmental conservation. The contest, the AI Impact Challenge, is soliciting applications from non-profit institutions and social-mission-driven for-profit companies until January 22.
CNET reported that at the announcement event at the Google office in Sunnyvale, California, the company’s head of AI Jeff Dean did not mention any of the recent controversy over Google developing AI for the Pentagon. At a press conference after the event, Dean reportedly said the program is “not really a reaction” to recent news.
A Google spokesperson also told Gizmodo the AI for Social Good endeavour is not a response to Project Maven.
At Monday’s event, Google highlighted some new humanitarian AI endeavours—like a collaboration with the National Oceanic and Atmospheric Administration to create AI that listens for the sounds of whales so shipping companies can avoid endangered species.
“We’re all grappling with questions of how AI should be used,” Dean said at the event, according to CNET. “AI truly has the potential to improve people’s lives.”
Amidst this new focus on ways technology can be used for good, hopefully some Google executives are also reflecting on the myriad ways their technology can be used for nefarious purposes.