Machines are getting freakishly good at recognizing human emotions

Until very recently, we’ve had to interact with computers on their own terms. To use them, humans had to learn inputs designed to be understood by the computer, whether that meant typing commands or clicking icons with a mouse. But things are changing. The rise of A.I. voice assistants like Siri and Alexa makes it possible for machines to understand humans the way they would ordinarily communicate in the real world. Now researchers are reaching for the next Holy Grail: computers that can understand emotions.

Whether it’s Arnold Schwarzenegger’s T-800 robot in Terminator 2 or Data, the android in Star Trek: The Next Generation, the inability of machines to understand and properly respond to human emotions has long been a common sci-fi trope. However, real-world research shows that machine learning algorithms are getting impressively good at recognizing the bodily cues we use to hint at how we’re feeling inside. And it could lead to a whole new frontier of human-machine interactions.

Don’t get us wrong: Machines aren’t yet as astute as your average human when it comes to recognizing the various ways we express emotions. But they’re getting a whole lot better. In a recent test carried out by researchers at Dublin City University, University College London, the University of Bremen, and Queen’s University Belfast, both people and algorithms were asked to recognize an assortment of emotions from human facial expressions.

The emotions included happiness, sadness, anger, surprise, fear, and disgust. While humans still outperformed machines overall (with an average accuracy of 73%, compared to 49% to 62% depending on the algorithm), the scores racked up by the various bots tested showed how far they have come. Most impressively, happiness and sadness were the two emotions at which machines outperformed humans at guessing, simply by looking at faces. That’s a significant milestone.

Emotions matter

Researchers have long been interested in finding out whether machines can identify emotion from still images or video footage. But it is only relatively recently that a number of startups have sprung up to take this technology mainstream. The recent study tested commercial facial recognition machine classifiers developed by Affectiva, CrowdEmotion, FaceVideo, Emotient, Microsoft, MorphCast, Neurodatalab, VicarVision, and VisageTechnologies. All of these are leaders in the growing field of affective computing, a.k.a. teaching computers to recognize emotions.

The test was carried out on 938 videos, including both posed and spontaneous emotional displays. The chance of a correct random guess by the algorithm for the six emotion types would be around 16%.
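
To put those numbers in context, here is a quick back-of-the-envelope check (a minimal sketch, assuming uniform random guessing across the six categories) of the chance baseline against the accuracies reported in the study:

```python
# Chance baseline for a six-way emotion classification task:
# a uniform random guess is correct about one time in six.
emotions = ["happiness", "sadness", "anger", "surprise", "fear", "disgust"]
chance_accuracy = 1 / len(emotions)
print(f"Random-guess accuracy: {chance_accuracy:.1%}")  # ~16.7%

# Accuracies reported in the study, for comparison.
human_accuracy = 0.73
machine_accuracy_range = (0.49, 0.62)
print(f"Humans: {human_accuracy:.0%}, machines: "
      f"{machine_accuracy_range[0]:.0%} to {machine_accuracy_range[1]:.0%}")
```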

Damien Dupré, an assistant professor at Dublin City University’s DCU Business School, told Digital Trends that the work is important because it comes at a time when emotion recognition technology is increasingly being relied upon.

“Since machine learning systems are becoming easier to develop, a lot of companies are now providing systems for other companies: mainly marketing and automotive companies,” Dupré said. “Whereas [making] a mistake in emotion recognition for academic research is, most of the time, harmless, stakes are different when implanting an emotion recognition system in a self-driving car, for example. Therefore we wanted to compare the results of different systems.”

The idea of controlling a car using emotion-driven facial recognition sounds, frankly, terrifying, especially if you’re the kind of person prone to emotional outbursts on the road. Fortunately, that’s not exactly how it’s being used. For instance, emotion recognition company Affectiva has explored the use of in-car cameras to identify emotion in drivers. It could one day be used to spot things like drowsiness or road rage, which might prompt a semi-autonomous car to take the wheel if the driver is deemed unfit to drive.

Researchers at the University of Texas at Austin, meanwhile, have developed technology that curates an “ultra-personal” music playlist that adapts to each user’s changing moods. A paper describing the work, titled “The Right Music at the Right Time: Adaptive Personalized Playlists Based on Sequence Modeling,” was published this month in the journal MIS Quarterly. It describes using emotion analysis that predicts not just which songs will appeal to users based on their mood, but the best order in which to play them, too.
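
As a rough illustration of that idea (a hypothetical sketch, not the researchers’ actual model; the song catalog and scoring function here are invented for the example), an adaptive playlist could repeatedly pick the song whose mood profile best matches the listener’s estimated mood, then update that estimate as the session goes on:

```python
from typing import Dict, List

# Hypothetical catalog: each song has a per-mood appeal score between 0 and 1.
CATALOG: Dict[str, Dict[str, float]] = {
    "Song A": {"happy": 0.9, "sad": 0.2, "calm": 0.5},
    "Song B": {"happy": 0.3, "sad": 0.8, "calm": 0.6},
    "Song C": {"happy": 0.6, "sad": 0.4, "calm": 0.9},
}

def build_playlist(mood: Dict[str, float], length: int = 3) -> List[str]:
    """Greedily order songs by how well they fit the listener's estimated mood."""
    remaining = dict(CATALOG)
    playlist = []
    for _ in range(min(length, len(remaining))):
        # Score each remaining song against the current mood estimate.
        best = max(
            remaining,
            key=lambda song: sum(mood[m] * remaining[song].get(m, 0.0) for m in mood),
        )
        playlist.append(best)
        # Nudge the mood estimate toward the profile of the chosen song.
        for m in mood:
            mood[m] = 0.7 * mood[m] + 0.3 * remaining[best].get(m, 0.0)
        del remaining[best]
    return playlist

# A listener who is mostly feeling down gets a sadder song first.
print(build_playlist({"happy": 0.2, "sad": 0.7, "calm": 0.1}))
```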

There are other potential applications for emotion recognition technology, too. Amazon, for instance, has recently begun to incorporate emotion-tracking of voices for its Alexa assistant, allowing the A.I. to recognize when a user is showing frustration. Further down the line, there’s the possibility this could lead to full-on emotionally responsive artificial agents, like the one in Spike Jonze’s 2013 movie Her.

In the recent study, emotion sensing was based on images of faces. However, as some of these examples show, there are other ways that machines can “sniff out” the right emotion at the right time.

“People are generating a lot of non-verbal and physiological data at any given moment,” said George Pliev, founder and managing partner at Neurodata Lab, one of the companies whose algorithms were tested for the facial recognition study. “Apart from the facial expressions, there are voice, speech, body movements, heart rate, and respiration rate. A multimodal approach states that behavioral data should be extracted from different channels and analyzed simultaneously. The data coming from one channel will verify and balance the data received from the other ones. For example, when facial information is for some reason unavailable, we can analyze the vocal intonations or look at the gestures.”
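
To make that multimodal idea concrete, here is a minimal sketch (the function, the channel names, and the equal weighting are illustrative assumptions, not Neurodata Lab’s actual system) of how per-emotion scores from several channels might be fused, with any missing channel simply dropped:

```python
from typing import Dict, Optional

EMOTIONS = ["happiness", "sadness", "anger", "surprise", "fear", "disgust"]

def fuse_channels(channel_scores: Dict[str, Optional[Dict[str, float]]]) -> Dict[str, float]:
    """Average per-emotion scores across whichever channels are available.

    channel_scores maps a channel name (e.g. "face", "voice", "gesture")
    to a dict of emotion probabilities, or None if that channel is missing.
    """
    available = [scores for scores in channel_scores.values() if scores is not None]
    if not available:
        raise ValueError("No behavioral channels available")
    return {
        emotion: sum(scores.get(emotion, 0.0) for scores in available) / len(available)
        for emotion in EMOTIONS
    }

# Example: the face is occluded, so only voice and gesture contribute.
fused = fuse_channels({
    "face": None,
    "voice": {"happiness": 0.1, "sadness": 0.6, "anger": 0.1,
              "surprise": 0.1, "fear": 0.05, "disgust": 0.05},
    "gesture": {"happiness": 0.2, "sadness": 0.5, "anger": 0.1,
                "surprise": 0.1, "fear": 0.05, "disgust": 0.05},
})
print(max(fused, key=fused.get))  # -> "sadness"
```

The design choice here mirrors what Pliev describes: each channel contributes its own estimate, and the channels that are present compensate for the ones that are not.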

Challenges ahead?

There are challenges, however, as all involved agree. Emotions are not always easy to identify, even for the people experiencing them.

“If you wish to teach A.I. how to detect cars, faces or emotions, you should first ask people what do these objects look like,” Pliev continued. “Their responses will represent the ground truth. When it comes to identifying cars or faces, almost 100% of people asked would be consistent in their replies. But when it comes to emotions, things are not that simple. Emotional expressions have many nuances and depend on context: cultural background, individual differences, the particular situations where emotions are expressed. For one person, a particular facial expression would mean one thing, while another person may consider it differently.”

Dupré agrees with the sentiment. “Can these systems [be guaranteed] to recognize the emotion actually felt by someone?” he said. “The answer is not at all, and they will never be! They are only recognizing the emotion that people are deciding to express — and most of the time that doesn’t correspond to the emotion felt. So the take-away message is that [machines] will never read … your own emotion.”

Still, that doesn’t mean the technology isn’t going to be useful. Or stop it from becoming a big part of our lives in the years to come. And even Damien Dupré leaves slight wiggle room when it comes to his own prediction that machines will never achieve something: “Well, never say never,” he noted.

The research paper, “Emotion recognition in humans and machine using posed and spontaneous facial expression,” is available to read online here.
