Emotion Recognition Is Creepy As Hell

Emotion recognition technology, at best, promises to read commuters’ mental anguish and adjust subway cabin conditions accordingly, and at worst, puts biased and buggy mental microscopes in the hands of corporate overlords. In a new report, the NYU research centre AI Now calls for regulators to ban the tech.

AI Now points out that emotion recognition, or “affect recognition”—a scientifically suspect technology that claims to detect personality traits and emotions, and to assess mental health—gives institutions an intimate new layer of biometric data to act on. The report argues that these institutions could wield the data to make decisions about our fitness to participate in society, such as “who is interviewed or hired for a job, the price of insurance, patient pain assessments, or student performance in school.”

The thought of a job interviewer reading your mind is freakish enough, but aside from that, this shit doesn’t even work. The report cites a ProPublica piece finding that “aggression detectors” developed by Sound Intelligence—which have been implemented at schools, prisons, hospitals, and banks—read coughs as signs of aggression.

This tech is also demonstrably biased. The report points to a study that ran photos of NBA players through Face++ and Microsoft’s Face API; both services consistently assigned black players more negative emotional scores than other players. According to the study, Face++ deemed black players more “aggressive,” while Microsoft’s Face API classified them as showing more “contempt”—even when they were smiling.
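For context on what those scores actually look like, here’s a minimal sketch of the kind of call such a study would make against Microsoft’s Face API. The endpoint region, subscription key, and image URL below are placeholders, not values from the study:

```python
# Hypothetical sketch: asking Microsoft's Face API to score emotions
# in a photo. Endpoint, key, and image URL are placeholders.
import requests

ENDPOINT = "https://<region>.api.cognitive.microsoft.com/face/v1.0/detect"
KEY = "<subscription-key>"  # assumption: a valid Azure subscription key

resp = requests.post(
    ENDPOINT,
    params={"returnFaceAttributes": "emotion"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.com/photo-of-player.jpg"},
)
resp.raise_for_status()

for face in resp.json():
    emotions = face["faceAttributes"]["emotion"]
    # Keys like "anger", "contempt", and "happiness" each map to a
    # confidence between 0 and 1; the study compared these across players.
    print(sorted(emotions.items(), key=lambda kv: -kv[1]))
```

The point is how little sits behind the label: the service returns a flat dictionary of confidence scores, and whatever biases shaped those numbers are invisible to the caller.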

Additionally, in November, the Harvard Business Review reported on China’s use of emotion recognition in schools to track students’ focus, and on the technology’s inadequacy: “Think about different learning styles: Some people are visual learners. Some learn by doing. Others favour intense solitary concentration. But an algorithm, perhaps designed by a visual learner, might completely miss or misinterpret such cues.”

“There remains little to no evidence that these new affect-recognition products have any scientific validity,” AI Now writes.

AI Now also echoes longstanding calls for governments and businesses to halt the use of facial recognition “in sensitive social and political contexts” until it’s better understood, and urges the industry to get serious about the racism and misogyny in its hiring practices, which skew the algorithms pumping out this garbage data.

AI Now would like to see more regulation like the Illinois Biometric Information Privacy Act (BIPA), which allows people to sue for nonconsensual collection and use of biometric data by private actors for purposes including tracking, surveillance, and facial recognition profiling.

The report also seeks to empower workers to push back against exploitative surveillance (namely, through unionisation), calls for tech employees to be told when they’re building spy tools, and wants governments to force AI companies to submit climate impact reports. (That last part is no small piece of AI’s burdens: AI Now estimates that “one AI model for natural-language processing can emit as much as 600,000 pounds of carbon dioxide.”)
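For a sense of scale, a back-of-the-envelope conversion of that figure. The per-car baseline below is an assumption taken from the Strubell et al. energy study that estimates like this trace back to:

```python
# Back-of-the-envelope conversion of the 600,000-pound figure quoted above.
POUNDS_PER_TONNE = 2204.62   # pounds in one metric tonne
CAR_LIFETIME_LB = 126_000    # assumption: an average car's lifetime CO2,
                             # the baseline used in Strubell et al.

emissions_lb = 600_000
print(f"{emissions_lb / POUNDS_PER_TONNE:.0f} tonnes of CO2")              # ~272 tonnes
print(f"≈ {emissions_lb / CAR_LIFETIME_LB:.1f} cars' lifetime emissions")  # ~4.8 cars
```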

The largely unregulated industry of emotion analytics is projected to take in US$25 billion (about AU$36 billion) in 2023, and it looms over every area of daily life: Disney has used it to track audience responses to its movies, researchers have turned to it to diagnose mental health conditions, and companies have used it to scrutinise customer service representatives’ conversations.

In 2016, Apple acquired the facial recognition start-up Emotient, developer of “Facet,” a tool that claims to catch subconscious micro-expressions and to process extremely low-res images.

Authorities in China are already using facial recognition to identify and track Muslim minorities and (quite likely) Hong Kong protesters.

Emotion recognition would add another dystopian layer to that reality, and the threat isn’t confined to countries outside the U.S. If left unchecked, facial recognition will inevitably spread to TSA checkpoints and police body cams.

