
AI assistants will soon recognize and respond to the emotion in your voice

You know how people say it’s not what you say, but how you say it, that matters? Well, very soon that idea could become part of smart assistants such as Amazon’s Alexa or Apple’s Siri. At least, it could if those companies decide to use new technology developed by emotion-tracking artificial intelligence company Affectiva.

Affectiva’s previous work focused on identifying emotion in images by observing how a person’s face changes when they express particular sentiments. Its latest technology builds on that premise with a cloud-based application programming interface (API) that can detect emotion in speech. Built with deep learning, the system observes changes in tone, volume, speed, and voice quality, and uses those cues to recognize emotions such as anger, laughter, and arousal in recorded speech.
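
To make the idea concrete, here is a minimal sketch of how a client might call a cloud-based speech emotion API of this kind. The endpoint URL, request fields, and response shape below are illustrative assumptions, not Affectiva’s documented interface.

```python
# Hypothetical sketch of calling a cloud-based speech emotion API.
# The endpoint URL, field names, and response format are assumptions
# for illustration only; they are not Affectiva's documented interface.
import requests

API_URL = "https://api.example.com/v1/speech/emotion"  # placeholder endpoint
API_KEY = "YOUR_API_KEY"                               # placeholder credential

def analyze_clip(path):
    """Upload a recorded speech clip and return per-event emotion scores."""
    with open(path, "rb") as audio:
        resp = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            files={"audio": audio},
        )
    resp.raise_for_status()
    # Assumed response shape: a list of timestamped "emotion events",
    # e.g. {"start": 2.4, "end": 3.1, "event": "anger", "confidence": 0.87}
    return resp.json()["events"]

if __name__ == "__main__":
    for event in analyze_clip("driver_sample.wav"):
        print(event["event"], event["confidence"])
```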

“The addition of Emotion AI for speech builds on Affectiva’s existing emotion recognition technology for facial expressions, making us the first AI company to allow for a person’s emotions to be measured across face and speech,” Rana el Kaliouby, co-founder and CEO of Affectiva, told Digital Trends. “This is all part of a larger vision that we have. People sense and express emotion in many different ways: Through facial expressions, voice, and gestures. We’ve set out to develop multi-modal Emotion AI that can detect emotion the way humans do from multiple communication channels. The launch of Emotion AI for speech takes us one step closer.”

Affectiva developed its speech emotion system by collecting naturalistic speech data from a variety of sources, including commercially available databases. This data was then labeled by human experts for the occurrence of what the company calls “emotion events.” These human-generated labels were used to train and validate the team’s deep learning models, so that, over time, the models learned how certain shifts in a person’s voice can indicate a particular emotion.
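
As a rough sketch of that train-and-validate loop, the snippet below fits a small classifier on acoustic features paired with human-assigned emotion labels. The feature set, label names, and synthetic data are stand-ins for illustration; Affectiva’s actual dataset and model architecture are not public.

```python
# Minimal sketch of the label-train-validate pipeline described above.
# Features (pitch, loudness, speaking rate, voice quality) and labels are
# synthetic stand-ins; Affectiva's real data and models are not public.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Each row: [mean_pitch, pitch_variance, loudness, speaking_rate, jitter]
X = rng.normal(size=(1000, 5))
# Human-assigned "emotion event" label for each clip
y = rng.choice(["neutral", "anger", "laughter"], size=1000)

# Hold out a validation split, mirroring the train/validate step in the text
X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.2, random_state=0
)

model = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
model.fit(X_train, y_train)

print("validation accuracy:", accuracy_score(y_val, model.predict(X_val)))
```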

It’s smart stuff from a technology perspective, but, like the best technology, it could also help users in practical ways. One application could be a car navigation system that hears a driver beginning to experience road rage and reacts to keep them from making a rash driving decision. It could similarly allow automated assistants to change their approach when they hear anger or frustration from a user, or to learn which kinds of responses elicit the best reactions and repeat those strategies.
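
As a toy illustration of that downstream logic, an assistant could branch on the detected emotion scores. The labels, scores, and threshold below are invented for the example rather than drawn from any real API output.

```python
# Toy example: adapt an assistant's response to a detected emotion.
# The emotion labels, scores, and threshold are illustrative assumptions.
def pick_response(emotion_scores, anger_threshold=0.7):
    """Choose a prompt style based on per-emotion confidence scores."""
    if emotion_scores.get("anger", 0.0) >= anger_threshold:
        # De-escalate: slow down, simplify directions, suggest a break
        return "calm"
    if emotion_scores.get("laughter", 0.0) >= 0.5:
        return "light"
    return "neutral"

print(pick_response({"anger": 0.82, "laughter": 0.05}))  # -> "calm"
```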

Luke Dormehl
Former Digital Trends Contributor