Alexa may soon know how irritated you are

It can be incredibly frustrating when a virtual assistant repeatedly misunderstands what you’re saying. Soon, though, some of them might at least be able to hear the irritation in your voice, and offer an apology.

Amazon is working on significant updates to Alexa, the virtual helper that lives inside the company’s voice-controlled home appliance, called Amazon Echo. These will include better language skills and perhaps the ability to recognize the emotional tenor of your voice.

Researchers have long predicted that emotional cues could make machine interfaces much smarter, but so far such capabilities have not been incorporated into any consumer product.

Rosalind Picard, a professor at MIT’s Media Lab, says adding emotion sensing to personal electronics could improve them: “Yes, definitely, this is spot on.” In a 1997 book, Affective Computing, Picard first mentioned the idea of changing the voice of a virtual helper in response to a user’s emotional state. She notes that research has shown how matching a computer’s voice to that of a person can make communication more efficient and effective. “There are lots of ways it could help,” she says.

The software needed to detect the emotional state in a person’s voice exists already. For some time, telephone support companies have used such technology to detect when a customer is becoming irritated while dealing with an automated system. In recent years, new machine-learning techniques have improved the state of the art, making it possible to detect more emotional states with greater accuracy, although the approach is far from perfect.
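At its simplest, these systems extract acoustic features such as loudness and pitch from the audio and feed them to a classifier. The sketch below is purely illustrative — the features (RMS energy as a loudness proxy, zero-crossing rate as a rough pitch proxy), the thresholds, and the synthetic "voices" are toy assumptions of mine, not the method used by Amazon or any call-center vendor, where trained machine-learning models replace the hand-set rules shown here.

```python
import math

SAMPLE_RATE = 8000  # samples per second (illustrative choice)

def tone(freq_hz, amplitude, seconds=1.0):
    """Generate a pure sine tone as a crude stand-in for a voice recording."""
    n = int(SAMPLE_RATE * seconds)
    return [amplitude * math.sin(2 * math.pi * freq_hz * i / SAMPLE_RATE)
            for i in range(n)]

def frame_features(samples, frame_size=400):
    """Split a waveform into frames and compute two simple prosodic features:
    RMS energy (loudness proxy) and zero-crossing rate (pitch proxy)."""
    feats = []
    for start in range(0, len(samples) - frame_size + 1, frame_size):
        frame = samples[start:start + frame_size]
        rms = math.sqrt(sum(s * s for s in frame) / frame_size)
        # Count sign changes between consecutive samples.
        zcr = sum(1 for a, b in zip(frame, frame[1:])
                  if (a >= 0) != (b >= 0)) / frame_size
        feats.append((rms, zcr))
    return feats

def sounds_irritated(samples, rms_threshold=0.5, zcr_threshold=0.05):
    """Toy rule: flag speech that is both louder and higher-pitched than a
    calm baseline. Real systems learn such boundaries from labeled data."""
    feats = frame_features(samples)
    avg_rms = sum(f[0] for f in feats) / len(feats)
    avg_zcr = sum(f[1] for f in feats) / len(feats)
    return avg_rms > rms_threshold and avg_zcr > zcr_threshold

# Soft, low-pitched tone vs. loud, higher-pitched tone.
calm = tone(120, 0.2)
annoyed = tone(300, 0.9)
```

A production system would use far richer features (pitch contours, speaking rate, spectral shape) and a model trained on recordings labeled by human annotators, but the pipeline — featurize frames, then classify — is the same shape.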

Here is the full story, and here is my recent New Yorker piece on how talking bots will affect us.
