
Stop Alexa with a wave of your hand, thanks to Elliptic Labs’ ultrasound technology


Elliptic Labs has been showing off a new application for its ultrasound-based gesture technology at MWC in Barcelona, and we caught up with the company to get a demo. The idea is that smart speakers with ultrasound virtual sensor technology inside can detect the presence of people and respond to a range of gestures.


Using a prototype consisting of a speaker with Amazon’s Alexa onboard and a Raspberry Pi, Elliptic Labs showed us how you can trigger Alexa with a double-tap palm gesture or cut it off mid-flow with a single palm tap. The gestures work from some distance away, letting you control your smart speaker without touching it or uttering a word.
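To illustrate how a tap-based trigger like this could be wired up in software, here is a minimal sketch of a classifier that distinguishes a single palm tap from a double tap by the time between detected taps. The function name, the 0.5-second window, and the mapping to assistant actions are illustrative assumptions on our part, not Elliptic Labs’ actual code.

```python
def classify_taps(tap_times_s: list[float], double_tap_window_s: float = 0.5) -> str:
    """Classify detected palm taps by inter-tap timing.

    In the prototype described above, a double tap triggers the assistant
    and a single tap cuts it off. Here, two taps landing within the window
    count as "double"; an isolated tap (or taps spaced too far apart)
    counts as "single".
    """
    if not tap_times_s:
        return "none"
    if len(tap_times_s) >= 2 and tap_times_s[1] - tap_times_s[0] <= double_tap_window_s:
        return "double"  # e.g. wake the assistant
    return "single"      # e.g. stop the assistant mid-flow

print(classify_taps([0.0, 0.3]))  # double
print(classify_taps([0.0]))       # single
```

In a real system, the tap timestamps themselves would come from the ultrasound detection pipeline; this only shows the final dispatch step.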

If you’re unfamiliar with Elliptic Labs, we met up with the company a couple of years back when it first began to roll its ultrasound gestures out into phones. The hope was that ultrasound might replace proximity sensors in phones, and the technology was subsequently integrated into Xiaomi’s Mi Mix handsets, allowing the manufacturer to shrink the bezels right down. The ultrasound sensor can detect when your hand or face is near and turn the screen on or off accordingly. Specific gestures can also be used to scroll around, snap selfies, or even play games.


With more microphones, Elliptic Labs’ tech can detect more specific gestures or positioning. In a phone with two microphones, this might allow you to wave your hand to turn the volume up or down. Most smart speakers have several microphones now, so there’s a great deal of potential for more gesture controls, or even for triggering specific actions when someone enters or leaves a room.
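The underlying physics is straightforward: an ultrasonic pulse reflects off your hand, and the round-trip time of the echo gives distance, while comparing arrival times at two microphones hints at which side the hand is on. The sketch below shows that principle only; the function names and the timing margin are our own assumptions, not Elliptic Labs’ implementation.

```python
SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at 20°C

def echo_distance_m(round_trip_s: float) -> float:
    """Distance to a reflecting hand from an ultrasonic echo's round-trip time.

    The pulse travels out and back, so the one-way distance is half
    the total path: d = v * t / 2.
    """
    return SPEED_OF_SOUND_M_S * round_trip_s / 2

def hand_side(left_echo_s: float, right_echo_s: float, margin_s: float = 1e-4) -> str:
    """With two microphones, the earlier echo suggests which side the hand is on."""
    delta = left_echo_s - right_echo_s
    if delta < -margin_s:
        return "left"
    if delta > margin_s:
        return "right"
    return "center"

# A hand roughly 0.5 m away returns an echo after about 2.9 ms:
print(round(echo_distance_m(0.0029), 2))  # 0.5
print(hand_side(0.0028, 0.0030))          # left
```

With more microphones, the same delay comparisons extend to finer positioning, which is what makes multi-mic smart speakers an attractive target.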

Elliptic Labs sees ultrasound as free spectrum that’s not currently being exploited, and the company is very optimistic about the potential applications.

“Any space where there are humans is fair game,” Guenael Strutt, Elliptic Labs’ VP of Product Development, told Digital Trends. “The possibilities are infinite.”

In the second demonstration we saw at MWC, the smart speaker was hooked up to a light. By placing your hand on one side of the speaker and holding it there you could turn up the light level, while holding your hand at the other side dimmed the bulb. It’s easy to imagine how this same gesture could work to tweak volume levels.
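The hold-to-dim interaction in that demo amounts to a simple control loop: while a hand is detected on one side, the brightness ramps in that direction each tick and is clamped to a valid range. This is a hypothetical sketch of that loop, with our own names and step size, not the prototype’s code.

```python
from typing import Optional

def adjust_brightness(level: int, side: Optional[str], step: int = 5) -> int:
    """Ramp brightness while a hand is held on one side of the speaker.

    side: "up" side, "down" side, or None when no hand is detected.
    Returns the new level, clamped to 0-100.
    """
    if side == "up":
        level += step
    elif side == "down":
        level -= step
    return max(0, min(100, level))

# Holding a hand on the "up" side for three ticks:
level = 50
for _ in range(3):
    level = adjust_brightness(level, "up")
print(level)  # 65
```

Swapping the light level for a volume level is the same loop, which is why the article suggests the gesture maps naturally onto volume control.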

We tested out both prototypes for ourselves and found them very easy and intuitive to use. The technology doesn’t require direct line of sight, because the sound can bounce off a wall, so even if your speaker is tucked behind a lamp or the arm of the couch, you can still use these gestures to control it. We think the stop gesture is the most potentially useful, because it can be tricky to use voice commands to stop Alexa when it starts speaking or plays the wrong song.

There’s no official support for ultrasound tech in smart speakers just yet, but Elliptic Labs has been talking to all the major players: Amazon, Google, and Apple. The company has also been working with chip manufacturers like Qualcomm, as well as suppliers further up the smart speaker supply chain, to try to integrate the technology into the chipsets and components that go into smartphones and smart speakers.

Having tried it out, we expect more manufacturers to adopt it in the near future. Smart speakers may be an easier sell than smartphones, though, unless Elliptic Labs can get ultrasound technology into the chipsets that manufacturers buy.

One of the key challenges for smartphones is reducing the power draw of the ultrasound sensor and working out clever ways to determine when it should be listening. Advances in machine learning and processor speed could make an important difference here, and Elliptic Labs has been working to determine the optimal model for gesture detection.

We’re excited to see what these ultrasound pioneers come up with next.

Simon Hill
Former Digital Trends Contributor
Simon Hill is an experienced technology journalist and editor who loves all things tech. He is currently the Associate Mobile…