In 2021, Gig Workers Were Forced to Endure ‘Unprecedented Surveillance’

Pa Edrissa Manjang, an Uber driver in the UK, spent a year of his life working for the ride-hailing company before he was abruptly fired by an algorithm. “It wasn’t good at all,” Manjang said in an interview with Worker Info Exchange. “It felt like most of the time you were dealing with robots.”

In a newly released report, Manjang claims he was terminated after Uber’s facial recognition verification system failed to recognise the photos he submitted to the app. Uber put the system in place as a safety measure to assure customers that their drivers are who they say they are, but in this case, and others like it, the system got it wrong. Manjang, who is Black and knew that facial recognition systems in general struggle to identify non-white users, appealed the decision and insisted on having a human review his photos, but says he was unsuccessful.

“It’s not what we’re used to,” Manjang said in the report. “I’ve worked with the government and public companies in this country. You have that access to your employer, but with Uber, it’s not the case. You feel like you’re working for a computer.”

Manjang’s story is emblematic of a wider dilemma plaguing gig workers around the world, detailed in a new 82-page report released Monday by Worker Info Exchange, Privacy International, and the App Drivers & Couriers Union, titled Managed by Bots: Data-Driven Exploitation in the Gig Economy. The report details the plethora of ways gig workers are regularly subjected to all-day, “unprecedented surveillance” as a condition of doing their jobs. Even worse, many of these workers find themselves on the receiving end of surveillance systems even while they are off the clock, waiting to accept a new job.

Though the specific types of monitoring techniques described vary widely, the report dives deep into fraud detection software and facial recognition verification systems, both of which are growing in popularity. Facial recognition systems in particular are often billed by app makers as a means to bolster security, but the report claims actual instances of gig workers trying to circumvent rules are relatively few and far between.

“The introduction of facial recognition technology by the industry has been entirely disproportionate relative to the risk perceived,” the report’s authors argue.

The report also details the way apps are increasingly using AI systems to perform roles once typically associated with a manager, in some cases even going as far as to fire workers. The report interrogates gig work firms’ use of algorithms to manage workers and dictate pricing through digital monitoring techniques like GPS tracking, customer ratings, and job completion rates. In Uber’s case, drivers’ past preferences and behaviour can also reportedly play a factor in whether the app directs a driver to a customer.

Researchers also found accounts of workers unjustly terminated after geolocation checks falsely accused them of fraudulently sharing their accounts. These examples point to both the closely monitored nature of the apps and the real-world consequences of AI-driven management decisions.

“Platform companies are operating in a lawless space where they believe they can make the rules,” Open Society Foundations Fellow Bama Athreya said. “Unfortunately, this isn’t a game; virtual realities have harsh consequences for gig workers in real life.”

Aside from contributing to an environment that makes workers increasingly feel like unvalued automatons, the continued outsourcing of key management decisions to AI systems could also potentially run afoul of some European legal protections.

Specifically, the report’s authors say they have seen an increased number of AI-driven worker dismissals throughout the gig industry, which they argue may violate Article 22 of the European Union’s General Data Protection Regulation (GDPR). Under that provision, workers have the right not to be subject to decisions with legal or similarly significant effects based solely on automated data processing.

Article 20 of the GDPR, meanwhile, states that data subjects (in this case, the gig workers) have the right to receive the data they’ve provided. And while most gig work apps do provide their workers with some data, the report’s authors claim they often stop short of providing the data necessary for drivers to meaningfully dispute their pay or other working conditions. In other cases, workers have to navigate a maze of complex websites just to access the data they are supposedly guaranteed. The report argues there is currently an “informational asymmetry” in which app makers possess all of the necessary data while the drivers themselves are often left in the dark.

While this may all sound pretty bleak for gig workers concerned with digital monitoring, there are some optimistic legal actions and changes brewing.

Earlier this year, Italy’s data protection authority took action against gig work company Deliveroo, issuing a €2.5 million fine for allegedly violating GDPR protections. In its ruling, the agency said the company lacked transparency around the way its algorithms were used to assign workers orders and book shifts.

In Spain, lawmakers recently approved a landmark law that forces delivery platforms to hire around 30,000 couriers who were previously classified as independent contractors and to provide more transparency around how algorithms are used in management. As part of the new law, companies will be required to give workers or their legal representatives information about how algorithms are used to assess their job performance. Meanwhile, in the UK, the country’s Supreme Court upheld a ruling earlier this year forcing Uber to classify its drivers as “workers,” rather than independent contractors, a distinction that grants them added labour protections.

There’s some movement around algorithmic transparency in the U.S. as well. Just last month, New York’s City Council passed a first-of-its-kind bill that prohibits employers from using AI screening tools to hire candidates unless those tools have undergone a bias audit.

The gig worker report makes clear that worker surveillance and AI management are now entrenched features of this imbalanced ecosystem, particularly as more and more traditional employers eye the gig work model as an attractive business opportunity. In that context, the authors argue, employment rights become “inextricably linked with the exercise of data rights.”