Alexa may soon know how irritated you are

It can be incredibly frustrating when a virtual assistant repeatedly misunderstands what you’re saying. Soon, though, some of them might at least be able to hear the irritation in your voice, and offer an apology.

Amazon is working on significant updates to Alexa, the virtual helper that lives inside the company’s voice-controlled home appliance, called Amazon Echo. These will include better language skills and perhaps the ability to recognize the emotional tenor of your voice.

Researchers have long predicted that emotional cues could make machine interfaces much smarter, but so far such technology has not been incorporated into any consumer technology.

Rosalind Picard, a professor at MIT’s Media Lab, says adding emotion sensing to personal electronics could improve them: “Yes, definitely, this is spot on.” In a 1997 book, Affective Computing, Picard first mentioned the idea of changing the voice of a virtual helper in response to a user’s emotional state. She notes that research has shown how matching a computer’s voice to that of a person can make communication more efficient and effective. “There are lots of ways it could help,” she says.

The software needed to detect the emotional state in a person’s voice exists already. For some time, telephone support companies have used such technology to detect when a customer is becoming irritated while dealing with an automated system. In recent years, new machine-learning techniques have improved the state of the art, making it possible to detect more emotional states with greater accuracy, although the approach is far from perfect.
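As a rough illustration of the idea (not Amazon's or any call-center vendor's actual system), irritation detection from voice is often framed as extracting simple acoustic features — loudness, pitch, speaking rate — and classifying them. A toy sketch in Python, with made-up thresholds and synthetic waveforms standing in for real speech:

```python
import math

def rms_energy(samples):
    """Root-mean-square energy of a waveform (a loudness proxy)."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def zero_crossing_rate(samples):
    """Fraction of adjacent sample pairs that change sign (a rough pitch proxy)."""
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0)
    return crossings / (len(samples) - 1)

def classify(samples, energy_thresh=0.5, zcr_thresh=0.05):
    """Toy rule: loud, high-pitched speech is flagged as 'irritated'.
    Thresholds are illustrative, not tuned on real data."""
    if rms_energy(samples) > energy_thresh and zero_crossing_rate(samples) > zcr_thresh:
        return "irritated"
    return "calm"

# Synthetic stand-ins: a quiet 120 Hz tone vs. a loud 300 Hz tone, 1 s at 8 kHz.
rate = 8000
calm = [0.2 * math.sin(2 * math.pi * 120 * t / rate) for t in range(rate)]
irritated = [0.9 * math.sin(2 * math.pi * 300 * t / rate) for t in range(rate)]

print(classify(calm))        # calm
print(classify(irritated))   # irritated
```

Production systems replace the hand-set thresholds with a model trained on labeled speech, and the two crude features with richer ones (spectral features, pitch contours), which is where the recent machine-learning gains come in.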

Here is the full story. Here is my recent New Yorker piece on how talking bots will affect us.


I can't wait. I work from home and am on a lot of calls, and there are at least two times I've yelled at Alexa to STOP and had my human interlocutors think I was yelling at my dog or kid.

The new Alexa integrates the latest advances in artificial passive-aggressive technology.

Well, if that's really what you want to do. Okay. Up to you. It's your decision.

If that's what you want, we'll do it your way. Fine. Just don't blame me if it doesn't work out the way you think it will.

This is an interesting topic. I'm not saying this just to say it or to be able to claim to be the first to comment. It really is interesting, and while I have nothing to contribute (I have a very out-of-date linguistics Ph.D. [Chomsky's minimalism was state of the art at the time], and I'm not involved in this or similar research), I'm looking forward to smart MR commenters who know something about it throwing their 2 cents in.

'In a 1997 book, Affective Computing, Picard first mentioned the idea of changing the voice of a virtual helper in response to a user’s emotional state.'

Well, Picard is fitting, as 30 years before that, a Star Trek episode aired with a running gag of the ship's computer's voice responding emotionally to the user.

The more modern tech dilemmas we face, the more respect I have for those Star Trek folks. I thought it was just geeky stuff back in the day, but we should have been putting those folks in charge of advanced problem-solving on future problems.

Yup, colorful floppy disks and voice recognition to offer input to the computer, both in TOS. I wonder if Prof. Picard asks her computer for "tea, Earl Grey, hot".

They should work on the underlying problem instead; voice recognition and comprehension are still not that good. Like 3D printing, it's been overhyped.

"I'm sorry Dave..."




Watson exploded while trying to analyze why you have put line breaks at random places in your comment.

Watson responds: Double space line breaks retain a format; single space line breaks run the text together.

Single space
Line break

Double space

Line Break

Tyler Cowen's New Yorker essay is a fascinating read. The last sentence ("we may end up treating people more like bots") is mesmerizing, but I wonder whether, on balance (considering how badly we treat each other), this possibility might be a good thing!
