Government Watchdog Finds Most U.S. Agencies Don’t Even Know Which Face Recognition Systems They Use

Despite the fact that more than a dozen federal agencies regularly use facial recognition tech, there’s hardly any oversight of the systems these agencies use. That’s according to a new report out Tuesday from the U.S. Government Accountability Office (GAO), a federal watchdog that provides investigative support to Congress.

Twenty different U.S. agencies — from ICE and the FBI to the Department of Veterans Affairs and the IRS — were found to be using facial recognition, according to the report. Seventeen of those agencies at least partially relied on tech from a private company, like Vigilant Solutions or Clearview AI. But hardly any of them knew for certain which privately owned systems their employees were using.

“Thirteen federal agencies do not have awareness of what non-federal systems with facial recognition technology are used by employees,” the report reads. “These agencies have therefore not fully assessed the potential risks of using these systems, such as risks related to privacy and accuracy.”

And there are plenty of those risks on the table. Multiple studies have shown that facial recognition systems are generally worse at identifying Black and Brown faces than those of their White counterparts. Even though the government has known about this specific shortcoming for years, we’ve seen countless law enforcement agencies — not to mention federal outfits like CBP — turn to the tech to apprehend people, sometimes with disastrous results.

Back in April, a Detroit man named Robert Williams sued his local police department after its facial recognition tech wrongly identified him as a shoplifting suspect and officers arrested him. The Williams case is the latest in a string of incidents in which Black men have been misidentified and wrongly arrested because of shoddy facial recognition tech.

Evidently, at least a few federal agencies are using these systems with barely any oversight.

“When we requested information from one of the agencies about its use of non-federal systems, agency officials told us they had to poll field division personnel because the information was not maintained by the agency,” the GAO’s report continues. “Officials from another agency initially told us that its employees did not use non-federal systems; however, after conducting a poll, the agency learned that its employees had used a non-federal system to conduct more than 1,000 facial recognition searches.”

While many of these searches were used to apprehend suspected bad actors or for general surveillance purposes — no fewer than six agencies monitored protestors in the aftermath of George Floyd’s killing, for example — the tech has more innocuous uses, too. The Transportation Security Administration (TSA) is using it to identify travellers before they board their flights. The Federal Bureau of Prisons (BOP) uses facial recognition to authenticate officers’ identities before they step into secure facilities. Per the GAO’s report, administrative officers used the tech to keep tabs on people under court-mandated supervision when they couldn’t meet in person due to the ongoing pandemic.

But regardless of use, the GAO said, every agency needs more oversight. At the very least, agencies should know what kind of tech they’re using and who’s providing it — not only to hold cops accountable when these systems go wrong, but to hold the companies behind the tech accountable as well.
