Virtual assistants are getting scarier — I mean, fancier — with rumoured features like facial recognition and integrated Google services. Then there’s Alexa, which can’t even perform a Google search. As the arms race ramps up, Amazon hopes you’ll like Alexa best if she can tell when you’re mad and apologise quickly enough for not knowing which album you want her to play.
A source familiar with Alexa, the assistant that powers the Amazon Echo, tells MIT Technology Review that researchers are working on natural-language-processing updates that will help it detect emotion in a user’s voice, as well as remember known information about a user and connect it to their requests. For example, if Alexa knows that a user lives in Seattle, it will factor in that information when deciding how to answer the question “How are the Hawks doing?” Or, if Alexa knows that you like to listen to Kanye, it will be more likely to recognise requests for his music in the future.
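To get a feel for what “more likely to recognise” could mean in practice, here is a minimal, purely hypothetical sketch: a speech recogniser typically produces several candidate transcripts with confidence scores, and a personalisation layer could nudge candidates that match known user preferences before picking a winner. Every name, score, and threshold below is invented for illustration; Amazon’s actual system is not public.

```python
# Hypothetical sketch: re-rank recogniser hypotheses using known user
# preferences. All names and numbers here are invented for illustration.

USER_PREFERENCES = {"kanye west"}  # artists this user often plays

def rerank(candidates, boost=0.2):
    """candidates: list of (transcript, score) pairs.

    Adds a small bonus to any transcript mentioning a preferred artist,
    then sorts best-first.
    """
    def adjusted(item):
        text, score = item
        bonus = boost if any(p in text.lower() for p in USER_PREFERENCES) else 0.0
        return score + bonus
    return sorted(candidates, key=adjusted, reverse=True)

# An acoustically ambiguous request: the wrong hypothesis scores slightly
# higher on audio alone, but the preference bonus flips the ranking.
hypotheses = [("play conway west", 0.60), ("play kanye west", 0.55)]
print(rerank(hypotheses)[0][0])  # → play kanye west
```

The point of the sketch is just the shape of the idea: personalisation does not change the acoustic model, it re-weights its output using what the assistant already knows about you.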
But Alexa will invariably mess something up, and that’s where the emotion-detection technology comes in. The software itself isn’t new — just think about all the times an automated telephone system has said, “I’m sorry, I don’t quite understand” after you lost your temper and started yelling. Amazon is simply hoping to incorporate a better and more sensitive version of this into Alexa.
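As a toy illustration of the kind of signal such software might use, here is a deliberately crude sketch that flags an utterance as “frustrated” when it is much louder than the user’s usual speech. Real emotion detection uses far richer acoustic features; the function names, sample values, and threshold below are all hypothetical.

```python
import math

# Toy sketch only: flag "raised voice" from loudness, loosely inspired
# by the frustration-detection idea above. Names and thresholds are
# hypothetical; this is not how Amazon's system works.

def rms(samples):
    """Root-mean-square amplitude of a list of audio samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def sounds_frustrated(samples, baseline_rms, ratio=1.8):
    """True if the utterance is much louder than the user's baseline."""
    return rms(samples) > baseline_rms * ratio

calm = [0.1, -0.1, 0.12, -0.08]    # quiet, ordinary speech
shout = [0.7, -0.8, 0.75, -0.6]    # the "I'm yelling at the phone tree" case
baseline = rms(calm)

print(sounds_frustrated(calm, baseline))   # → False
print(sounds_frustrated(shout, baseline))  # → True
```

In a real system this signal would presumably be combined with pitch, speaking rate, and word choice, but the principle is the same: detect the anger, then trigger the apology.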
All this is definitely part of Amazon’s attempt to improve Alexa as competitors are (supposedly) about to hit the market. Google’s rumoured Echo competitor makes almost too much sense given how good Google’s voice recognition and search capabilities are. But at least Alexa might be the first to say “sorry” when you get mad at how dumb it is?