Intel is learning a tough lesson after partnering with Classroom Technologies to develop a face-reading AI that detects the emotions of students on Zoom calls.
The student engagement technology, created by Intel with Classroom Technologies’ Class software, captures images of students’ faces via their webcams, analyses them with computer vision, and combines the results with contextual information to predict engagement levels from their emotions.
The goal is to provide educators with emotional response data they can use to customise lessons and improve student engagement. The AI might detect that students become confused during a specific section of a lesson and send that information to teachers so they can reassess how that particular subject is being taught.
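Neither company has published implementation details, but the pipeline described above (per-frame emotion scores combined with contextual signals, aggregated into an engagement label a teacher could act on) can be sketched roughly. Everything below, including the emotion categories, the gaze signal, and the thresholds, is illustrative guesswork, not Intel's actual system:

```python
# Hypothetical sketch of the kind of pipeline described in the article:
# per-frame emotion scores (as a real system might get from a face-expression
# model) plus a contextual signal are aggregated into a coarse engagement
# label. All names and thresholds are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class FrameSignal:
    emotions: dict          # e.g. {"confused": 0.7, "bored": 0.1} from a classifier
    gaze_on_screen: bool    # a contextual signal (illustrative)


def engagement_label(frames):
    """Aggregate per-frame signals into one label for a teacher dashboard."""
    if not frames:
        return "unknown"
    n = len(frames)
    confusion = sum(f.emotions.get("confused", 0.0) for f in frames) / n
    boredom = sum(f.emotions.get("bored", 0.0) for f in frames) / n
    attention = sum(1 for f in frames if f.gaze_on_screen) / n
    # Arbitrary cut-offs -- exactly the kind of reductive threshold critics
    # quoted later in this article object to.
    if confusion > 0.5:
        return "confused"
    if boredom > 0.5 or attention < 0.3:
        return "disengaged"
    return "engaged"


frames = [
    FrameSignal({"confused": 0.7, "bored": 0.1}, True),
    FrameSignal({"confused": 0.6, "bored": 0.2}, True),
]
print(engagement_label(frames))  # prints "confused"
```

The sketch also makes the critics' point concrete: whatever nuance exists in a student's face is collapsed into one of three labels by a handful of hard-coded thresholds.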
“Intel is dedicated to ensuring teachers and students have access to the technologies and tools needed to meet the challenges of the changing world,” Michael Campbell, Intel’s global director for the education consumer and commercial segments, said. “Through technology, we have the ability to set the standard for impactful synchronous online learning experiences that empower educators.”
Classroom Technologies CEO Michael Chasen says teachers have trouble engaging with students in a pandemic-era virtual classroom, and that the insights offered by this AI tech can help educators better communicate. Classroom Technologies plans to test the emotion-reading technology, which Intel hopes to develop into a product for widespread distribution.
As detailed in a Protocol report, this face-reading AI already has its critics, who argue that using face recognition technology on students is an invasion of privacy and that the technology oversimplifies human emotion, which can lead to damaging results.
As learning has shifted from the classroom to the home, schools have desperately searched for new ways to engage with students. An early debate revolved around the use of webcams. Those in favour argued that face-to-face interaction improved learning and forced accountability, while those against said requiring webcams breached privacy and could increase stress and anxiety levels. Reading students’ faces and analysing them with AI adds another layer to the problem, critics say.
“I think most teachers, especially at the university level, would find this technology morally reprehensible, like the panopticon,” Angela Dancey, a senior lecturer at the University of Illinois Chicago, told Protocol. “Frankly, if my institution offered it to me, I would reject it, and if we were required to use it, I would think twice about continuing to work here.”
These criticisms arrive at a time when schools are abandoning invasive proctoring software that exploded during the pandemic as students were forced to learn remotely. Often used to discourage cheating, these tools use webcams to monitor eye and head movements, tap microphones to listen to the room, and record every mouse click and keystroke. Students around the country have signed petitions arguing the technology is an invasion of privacy, discriminates against minorities, and punishes those with disabilities, as Motherboard reports.
There is also the question of whether facial expressions can be accurately used to assess engagement. Researchers have found that people express emotion in ways that defy simple measurement, and critics argue that emotions can’t be determined from facial expressions alone. Assuming a student has tuned out of a lesson simply because an algorithm judges them to look uninterested reduces the complexity of human emotion to a single signal.
“Students have different ways of presenting what’s going on inside of them,” Todd Richmond, a professor at the Pardee RAND Graduate School, told Protocol. “That student being distracted at that moment in time may be the appropriate and necessary state for them in that moment in their life.”
There is also some concern that analytics provided by AI could be used to penalise students. If, say, a student is deemed to be distracted, they might get poor participation scores. And teachers might feel incentivised to use the data should a school system evaluate educators by the engagement scores of their students.
Intel created the emotional analytics technology using data captured with 3D cameras in real-life classrooms, and worked with psychologists to categorise facial expressions. Some teachers have found the AI to be helpful, but Chasen says he doesn’t think Intel’s system has “reached its maturity yet” and needs more data to determine whether the results the AI spits out actually match the performance of students. Chasen says Intel’s tech will be only one piece of a larger puzzle in assessing students.
Intel and Classroom Technologies claim their technology wasn’t designed as a surveillance system or to be used as evidence to penalise students, but as we so often see in the tech industry, products are frequently used in ways not intended by their creators.
We’ve reached out to Classroom Technologies for comment and will update this story when we hear back.