
Brain-reading tech lets paralyzed people control a tablet with their thoughts

BrainGate Collaboration

The U.S.-based BrainGate consortium has developed technology that makes it possible for people with paralysis to use tablets and other mobile devices — simply by thinking about cursor movements and clicks.

The technology uses a miniature implanted sensor to record neural activity from the user's motor cortex, the part of the brain responsible for planning, controlling, and executing voluntary movements. These signals are then decoded and turned into point-and-click commands for controlling software. Using the system, three clinical trial participants were able to use a Google Nexus 9 tablet for email, chat, music streaming, and video sharing. They also browsed the internet, checked the weather, and shopped online, among other applications.
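The decoding step can be pictured as a mapping from binned neural firing rates to cursor velocity. The sketch below is a hypothetical illustration only, not BrainGate's actual decoder: the weight matrix would normally be fit during a calibration session, and here it is random, with fake Poisson "spike counts" standing in for real recordings.

```python
import numpy as np

rng = np.random.default_rng(0)

N_CHANNELS = 96   # electrodes on a typical intracortical array (assumed)
BIN_MS = 20       # spike counts binned every 20 ms (assumed)

# In a real system, calibration would estimate W (2 x N_CHANNELS) and b
# by regressing the user's intended cursor velocity onto neural features.
W = rng.normal(scale=0.01, size=(2, N_CHANNELS))
b = np.zeros(2)

def decode_velocity(spike_counts: np.ndarray) -> np.ndarray:
    """Map one bin of spike counts to a (vx, vy) cursor velocity."""
    return W @ spike_counts + b

# Simulate one second of decoding: integrate velocity into position.
position = np.zeros(2)
for _ in range(1000 // BIN_MS):
    counts = rng.poisson(lam=2.0, size=N_CHANNELS)  # fake neural data
    position += decode_velocity(counts) * (BIN_MS / 1000.0)

print(position.shape)  # (2,): an x, y cursor position
```

A linear readout like this is only the simplest possible decoder; published intracortical BCI work typically layers filtering and recalibration on top of the same basic idea.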


“For years, the BrainGate collaboration has been working to develop the neuroscience and neuroengineering know-how to enable people who have lost motor abilities to control external devices just by thinking about the movement of their own arm or hand,” Dr. Jaimie Henderson, a Stanford University neurosurgeon, said in a statement. “In this study, we’ve harnessed that know-how to restore people’s ability to control the exact same everyday technologies they were using before the onset of their illnesses. It was wonderful to see the participants express themselves or just find a song they want to hear.”


What is particularly impressive about this demonstration is the speed of the interactions. Participants were able to make up to 22 point-and-click selections per minute, or type up to 30 characters in the same time frame. They also reported that the experience felt intuitive, with one participant noting, “It felt more natural than the times I remember using a mouse.”

This isn’t the first time we’ve covered amazing brain-computer interfaces, capable of letting people do everything from playing games of Tetris to controlling robot arms using only their thoughts. The more work that is done in this area, however, the closer we get to this technology being perfected and made available to everyone who needs it.

A paper describing this latest project, titled “Cortical control of a tablet computer by people with paralysis,” was recently published in the journal PLoS ONE.

Luke Dormehl
Former Digital Trends Contributor
Groundbreaking A.I. brain implant translates thoughts into spoken words

Researchers from the University of California, San Francisco, have developed a brain implant that uses deep-learning artificial intelligence to transform thoughts into complete sentences. The technology could one day be used to help restore speech to patients who are unable to speak due to paralysis.

“The algorithm is a special kind of artificial neural network, inspired by work in machine translation,” Joseph Makin, one of the researchers involved in the project, told Digital Trends. “Their problem, like ours, is to transform a sequence of arbitrary length into a sequence of arbitrary length.”
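The sequence-to-sequence idea Makin describes can be sketched in miniature: an encoder folds a variable-length input sequence into a fixed-size state, and a decoder unrolls that state into an output sequence of a different length. Everything below is a toy with random weights and arbitrary sizes, assumed purely for illustration; the real system uses trained deep networks on neural recordings.

```python
import numpy as np

rng = np.random.default_rng(1)
D_IN, D_HID, D_OUT = 8, 16, 5   # feature sizes (arbitrary assumptions)

# Untrained weights; a real model would learn these from data.
W_in = rng.normal(scale=0.1, size=(D_HID, D_IN))
W_rec = rng.normal(scale=0.1, size=(D_HID, D_HID))
W_out = rng.normal(scale=0.1, size=(D_OUT, D_HID))

def encode(xs):
    """Fold an input sequence, step by step, into one hidden state."""
    h = np.zeros(D_HID)
    for x in xs:
        h = np.tanh(W_in @ x + W_rec @ h)
    return h

def decode(h, steps):
    """Unroll the hidden state into `steps` output vectors."""
    outs = []
    for _ in range(steps):
        h = np.tanh(W_rec @ h)
        outs.append(W_out @ h)
    return outs

neural_seq = [rng.normal(size=D_IN) for _ in range(12)]  # 12 input time bins
tokens = decode(encode(neural_seq), steps=4)             # 4 output tokens
print(len(tokens), tokens[0].shape)  # 4 (5,)
```

The point of the analogy to machine translation is exactly this decoupling: twelve time bins of brain activity in, four word-like outputs out, with no fixed correspondence between input and output lengths.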

Want to shake hands with the future? Check out this brain-controlled prosthetic

This story is part of our continuing coverage of CES 2020, including tech and gadgets from the showroom floor.

There are some moments you just know that you’re staring the future in the face. The moment at CES that gave us that feeling more than any other? Shaking hands with an astonishingly lifelike artificial intelligence-aided prosthetic hand, controlled via the wearer’s brain waves and muscle signals. It felt solid, natural, and… well, pretty much like any other handshake, really.

Brain-reading headphones are here to give you telekinetic control

For the past 45 years, SIGGRAPH, the renowned annual conference for all things computer graphics, has been a great place to look for a sneak peek at the future. In the 1980s, it was where animation enthusiasts Ed Catmull and John Lasseter first crossed paths; a decade later, they had created Toy Story, the first feature-length computer-animated movie. In the 1990s, it was home to a spectacular demo in which thousands of attendees used color-coded paddles to play a giant collaborative game of Pong. Today, online games played by millions are everywhere.

And, in 2017, it was where an early-stage startup called Neurable and VR graphics company Estudiofuture demonstrated a game called The Awakening. In The Awakening, players donned a VR headset along with head-mounted electrodes designed to read their brain waves. Using machine learning to decode these messy brain signals, Neurable was able to turn thoughts into game actions. Players could select, pick up, and throw objects simply by thinking about it. No gamepad, controller, or body movement necessary. Is this the future of computer interaction as we know it?
