The future of artificial intelligence and war sounds dizzyingly high-tech, but that’s not always the case.
The truth, for at least one multimillion-dollar San Francisco startup and its customers Google and the U.S. military, is that building cutting-edge AI apparently involves a group of old-fashioned human beings grinding through mountains of data, often with no idea what they’re working on.
Google hired gig workers through a “human-in-the-loop” machine learning company called Figure Eight to help develop Project Maven, the company’s controversial contract with the U.S. Defence Department to build an artificial intelligence program that analyses drone footage, according to a Monday report from the Intercept.
Those workers had no idea they were working for either Google or the U.S. military, Figure Eight account executive Will Pleskow reportedly told the Intercept. Gig workers on this kind of crowdsourcing platform can earn as little as $1 per hour for “micro-tasks” as part of a model called “human-in-the-loop” designed to have humans fill in on tasks where computers typically fail.
Project Maven first came to light last year when Gizmodo reported Google’s partnership with the U.S. Department of Defence to build an AI program to analyse drone footage. Google decided against renewing the contract after public outcry and an internal firestorm when employees first learned of the project.
Thousands of Google employees signed a petition demanding Google cancel the contract, and dozens of employees resigned from the firm.
Although Google officials defended the company’s involvement on Project Maven and characterised the work as minor, emails reviewed by Gizmodo showed Google executives saw a huge opportunity for growth in the possibility of lucrative business with the Pentagon and projects that could ultimately lead to a cutting-edge AI-powered system capable of surveilling entire cities.
Google took the rare step of backing away from Pentagon contracts that executives hoped could be worth up to $346 million, due in large part to employee revolt. That makes it even more striking that the company had an unspecified number of gig workers helping build the drone AI program who did not, and could not, know what they were building. That secrecy is by design.
The Google employees who pushed back against Project Maven are typically the traditional and relatively wealthy tech workers associated with Silicon Valley. It’s the generally poorer and less-supported gig workers situated around the world—tech workers in their own right—who have little recourse to know or influence the Silicon Valley giants they indirectly work for.
“Contributors to the Figure Eight platform are not given who the data will benefit,” one former crowd worker told the Intercept. “Usually, they are given a reason for why they are doing a task, like, ‘Draw boxes around a certain product to help machines recognise it,’ but they are not given the company that receives the data.”
According to Figure Eight’s Pleskow, the people who do “micro-tasks” for the company rarely know what kind of program they are training.
“Human-in-the-loop (HITL) is a branch of artificial intelligence that leverages both human and machine intelligence to create machine learning models,” Figure Eight’s marketing material explains. “In a traditional human-in-the-loop approach, people are involved in a virtuous circle where they train, tune, and test a particular algorithm.”
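In practice, the pattern Figure Eight describes often works by letting the model handle the predictions it is confident about and routing the uncertain ones to human annotators as paid micro-tasks, whose answers then feed the next round of training. A minimal sketch of that routing step, with all names (`route_predictions`, `CONFIDENCE_THRESHOLD`) purely illustrative rather than drawn from Figure Eight’s actual platform:

```python
# Hypothetical sketch of the human-in-the-loop routing pattern:
# confident model predictions are accepted automatically, while
# low-confidence items are queued for human annotators.

CONFIDENCE_THRESHOLD = 0.8  # illustrative cutoff, not a real platform setting

def route_predictions(predictions):
    """Split model output into auto-accepted labels and items needing review.

    `predictions` is a list of (item_id, label, confidence) tuples.
    """
    accepted, needs_human_review = [], []
    for item_id, label, confidence in predictions:
        if confidence >= CONFIDENCE_THRESHOLD:
            accepted.append((item_id, label))
        else:
            # On a crowdsourcing platform these items would become
            # micro-tasks, e.g. "draw boxes around a certain product".
            needs_human_review.append(item_id)
    return accepted, needs_human_review

accepted, review_queue = route_predictions([
    ("img-001", "truck", 0.95),
    ("img-002", "building", 0.41),
    ("img-003", "vehicle", 0.88),
])
# accepted -> [("img-001", "truck"), ("img-003", "vehicle")]
# review_queue -> ["img-002"]
```

The human answers for the review queue would then be merged back into the training data, closing the “virtuous circle” of train, tune, and test; crucially, nothing in a task like this reveals to the annotator whose model the labels ultimately train.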
Meredith Whittaker, a researcher at NYU’s AI Now Institute, said the story illustrated “core issues” about the future of AI.
Illustrating core issues anyone serious abut "AI bias" needs to contend w/, including the precarious "shadow workforces" required to create AI & the practice of making workers contribute to military [/other oppressive] tech w/o their knowledge or consent https://t.co/F1Z7KlLrlZ
— Meredith Whittaker (@mer__edith) February 4, 2019
Following the very public turbulence over its involvement with the Pentagon, Google released an updated set of artificial intelligence ethical principles that includes a promise to “be accountable to people.” Asked whether that principle would rule out this kind of blind work in the future, the company did not respond.
Google’s Project Maven contract ends sometime in 2019. The company did not respond to Gizmodo’s request for comment or questions about the contract’s exact timeline. Figure Eight also did not respond to questions.