Uber Is Being Sued Over Its “Racist” Facial Recognition System

Rideshare company Uber is back in the headlines, and for the wrong reasons: the company has been accused of using a racially biased facial recognition algorithm. A United Kingdom labour union, the Independent Workers' Union of Great Britain (IWGB), is taking the company to court over it, Vice reports.

Vice was provided with the labour union's complaint, which alleges that Uber's "Real Time ID Check" has a serious racial bias. The check requires drivers to periodically submit a selfie through the Uber app to verify their identity, but one driver of colour, an IWGB member, failed two consecutive ID checks after submitting two photos of himself.

“After submitting his photograph through the App, the Claimant received a message from Uber stating that he had failed to verify his identity and that his account had been waitlisted for 24 hours,” the complaint reads. “On 14 April 2021 the Claimant was informed by Uber that his account had been deactivated after the second attempt at verification.

“Before the decision was taken, the claimant was never offered a human facial recognition check.”

The driver is also said to have gone to Uber's London office to challenge the deactivation, and the Uber staff member he spoke to agreed that the deactivation was in error. However, the staff member said they couldn't do anything about it. Uber then ignored an email sent by Nader Awaad, chair of the IWGB's driver branch, requesting a sit-down between Uber and the IWGB to hash things out in person.

This isn't the first time drivers have run into the problem. Earlier this year, 14 UberEats couriers who were people of colour told Wired that their accounts had been frozen, and in some cases terminated, because they failed that same ID check.

Facial recognition algorithms have been shown to be racially biased in the past. One Harvard report found that these issues appear across the board, whether in airport passenger screening, law enforcement surveillance, or employment and housing decisions. Black women aged 18-30 are misidentified by facial recognition algorithms more often than any other demographic, and lighter-skinned people are more likely to be correctly identified than their peers with darker skin.