
Death of the mouse: How eye-tracking technology could save the PC


In the last few years, touch control has revolutionized the way we interact with mobile devices. The technology has been so popular on smartphones that Apple used its proven touch approach to reinvent the dead tablet market with the iPad. Thanks to the booming growth of these devices, touch is spreading to new form factors and posing a potential threat to our oldest friend: the PC.

With all of these motion-controlled interfaces for video game systems and touch interfaces for mobile devices, the PC, with its keyboard and mouse, just feels, well, old. The keyboard is still the fastest and best way to enter large amounts of data and to author written content, but the mouse and touchpad are a step removed from the natural, direct feeling one gets when using the Wii, Xbox Kinect, or a touch tablet. Tobii hopes to rectify this imbalance.

Last Friday, I met up with Barbara Barclay, North American manager of Tobii Technologies (a Swedish company), to try out a completely new type of user interface built for consumer desktops and laptops. In a small office building in Manhattan, she let me test one of only 20 prototype Lenovo laptops, each with built-in infrared sensors that track eye movement so precisely and quickly that they make even the best mouse interfaces feel antiquated.

Here’s how it works

Before we began the demo, Barbara explained the technology. Tobii’s eye control works a bit like the Xbox Kinect (or a reverse Wii), but on a much closer scale. As you sit in front of the laptop, a pair of synced infrared sensors located under the screen scans your eyes. They do this about 30 to 40 times per second, examining the size and angle of your pupils, the glint in each of your eyes, and the distance between you and the laptop. Together, the two sensors create a stereoscopic 3D image of your eyes for the computer to examine. Based on the angle and glint of your eyes, Tobii’s technology calculates precisely which part of the screen you are looking at. It can even tell when you look away or close your eyes. To save power, the demo unit on hand darkened its screen when we looked away.
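To make the idea concrete, here is a minimal geometric sketch of the final step in a pipeline like that: intersecting an estimated gaze ray with the screen plane to find the point being looked at. The eye position and gaze direction would come from the stereo infrared measurements (pupil centre and corneal glint); the simplified model and every name below are assumptions for illustration, not Tobii’s actual algorithm.

```python
# A minimal sketch, assuming a known eye position and gaze direction:
# intersect the gaze ray with the screen plane to find the gaze point.
import numpy as np

def gaze_point_on_screen(eye_pos, gaze_dir, screen_origin, screen_normal):
    """Return the 3D point where the gaze ray hits the screen plane.

    eye_pos       -- 3D position of the eye (e.g. from stereo triangulation)
    gaze_dir      -- gaze direction vector (e.g. from pupil/glint geometry)
    screen_origin -- any point on the screen plane (e.g. its top-left corner)
    screen_normal -- unit normal of the screen plane
    """
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    denom = np.dot(gaze_dir, screen_normal)
    if abs(denom) < 1e-6:          # looking parallel to the screen
        return None
    t = np.dot(screen_origin - eye_pos, screen_normal) / denom
    if t < 0:                      # looking away from the screen
        return None
    return eye_pos + t * gaze_dir

# Example: an eye 60 cm from a screen lying in the z = 0 plane.
hit = gaze_point_on_screen(
    eye_pos=np.array([0.0, 0.0, 0.6]),
    gaze_dir=np.array([0.1, -0.05, -1.0]),
    screen_origin=np.zeros(3),
    screen_normal=np.array([0.0, 0.0, 1.0]),
)
print(hit)   # 3D point on the screen plane, later mapped to pixels
```

In a real tracker, that 3D intersection would then be converted into pixel coordinates and fed to whatever interface element sits under the gaze.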

After explaining how it works, Barbara calibrated the Lenovo-Tobii eye control PC for her eyes. The calibration process takes only a few seconds: you look at a series of three to nine dots on the screen, which teaches the computer how your eyes look when they focus on known points. Nintendo has used similar calibration steps for its Wii Motion Plus controllers and Wii Fit balance board software. The calibration is painless and shouldn’t have to be done very often. Afterward, the laptop can save your “eye profile” and recall it whenever a familiar user logs on.
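As a rough illustration of what a calibration step like that might compute, here is a sketch that fits a least-squares mapping from raw eye measurements to the known on-screen positions of the dots. The affine model, the made-up feature values, and the function names are all assumptions for illustration; the article doesn’t describe Tobii’s actual calibration math.

```python
# A minimal sketch, assuming raw gaze features and known dot positions:
# fit an affine map screen = [fx, fy, 1] @ A by least squares.
import numpy as np

def fit_calibration(eye_features, dot_positions):
    """Fit an affine mapping from (N, 2) eye features to (N, 2) screen dots."""
    X = np.hstack([eye_features, np.ones((len(eye_features), 1))])  # (N, 3)
    A, *_ = np.linalg.lstsq(X, dot_positions, rcond=None)           # (3, 2)
    return A

def apply_calibration(A, eye_feature):
    """Predict the screen point for one raw eye measurement."""
    fx, fy = eye_feature
    return np.array([fx, fy, 1.0]) @ A

# Nine calibration dots and noisy features recorded while staring at them.
dots = np.array([[x, y] for y in (100, 540, 980) for x in (160, 960, 1760)],
                dtype=float)
features = dots * 0.001 + np.random.normal(scale=0.002, size=dots.shape)
A = fit_calibration(features, dots)
print(apply_calibration(A, features[4]))   # roughly the centre dot
```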

It actually works…really well

Next, we ran through several hand-crafted demos that showed off different use scenarios for eye control. Even after I tried it out myself, I had a hard time believing the demonstration wasn’t an elaborate ruse. However, after a few moments, I began to believe.

Almost instantly, the computer began to pick up my eye movements and respond to whatever I looked at. When I looked at an item, it would highlight itself and come to the forefront. When I looked at a map, it knew exactly which area I was staring at, right down to the pixel. Much like how a novice with the Wii will wave the Wii Remote wildly, my first instinct was to move my entire head as I looked at different items on the screen. This worked well enough, but after a few moments, I learned that Tobii’s technology could pick up the subtlest of eye movements without the aid of my head moving. Somehow it could tell when I moved my eyes a half inch to the left or right or casually looked up or down, even a hair.

Lenovo-Tobii eye control PC

The fluidity of the experience reminded me of the first time I used the iPod Touch, and how natural it felt to swipe and touch precisely where I wanted. Before the Touch and iPhone, most touchscreens used resistive touch technology, which required you to actually press down on the screen. These screens demanded a stylus (pen-like device) to achieve precision, but Apple changed the game with its more natural interface that let you directly use your fingers. Tobii’s eye control technology is as direct as any touch interface. It feels like touch from afar.

Uses for eye control

The first portion of the demo (which you can watch below) simply shows where your eyes are looking on the screen. Watching where your own gaze lands isn’t particularly fun, but it shows you how fast the system reacts when you move or blink. After the intro screen and calibration, though, we got into some different use scenarios.

Reading: Of all the uses for eye control, reading demonstrates its value more than anything else. Everyone has their own technique and style for reading on a laptop or touch device. Personally, I tend to keep my text toward the top of the screen. Sometimes I use the mouse to highlight things I’ve already read and the arrow keys to scroll down. Tobii’s eye control instantly makes all of these customized reading styles irrelevant. It feels more natural than reading a book: text automatically scrolls up, down, left, or right as your eyes pan around the screen. It’s amazing. Confused about a word? Stare at it long enough and its definition will pop up.
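That definition pop-up is essentially a dwell-time trigger: if the gaze stays on the same word past a threshold, act on it. Here is a toy sketch of that idea; the DwellDetector class, the 0.8-second threshold, and the pre-resolved word stream are all assumptions for illustration, not how Tobii’s demo is actually built.

```python
# A toy sketch, assuming some other code already maps the gaze point to the
# word underneath it: fire an action once the gaze dwells long enough.
import time

class DwellDetector:
    def __init__(self, threshold=0.8):        # assumed 800 ms dwell time
        self.threshold = threshold
        self.current_target = None
        self.started_at = None

    def update(self, target):
        """Feed the word (or None) under the gaze; return it once dwelt on."""
        now = time.monotonic()
        if target != self.current_target:      # gaze moved to a new word
            self.current_target = target
            self.started_at = now
            return None
        if target is not None and now - self.started_at >= self.threshold:
            self.started_at = now              # re-arm so it fires once
            return target
        return None

detector = DwellDetector()
for word in ["the", "antiquated", "antiquated", "antiquated"]:
    fired = detector.update(word)
    if fired:
        print(f"show definition for '{fired}'")
    time.sleep(0.5)                            # pretend frames arrive slowly
```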

Playing media: Another demo Barbara showed me was a simple media player. A row of pictures and album covers fills the bottom of the screen. Glancing at one of them highlights the choice, and looking upward plays the music or maximizes the picture so you can get a better look. Done listening or viewing? Simply look at another item in the list. And if you look at the arrows on the left or right for a second or so, the next page of results appears.

Zooming and panning: Eye control doesn’t mean there is no use for the keyboard. In a Google Maps-like demo, you can pan and zoom by looking while pressing or holding a button, and it works well. One button is assigned to zoom and another to pan. To zoom, you simply look at what you wish to focus on and push the zoom button. Once zoomed, holding the pan button and looking left, right, up, or down lets you pan around the zoomed image. There are likely even more intuitive ways to perform complex tasks like this.
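For a concrete picture of the gaze-plus-key combination, here is a minimal sketch in which a hypothetical zoom key re-centres the viewport on the gaze point and a hypothetical pan key drifts the view toward wherever you look. The viewport model, screen size, and speed constants are assumptions for illustration, not the demo’s real behavior.

```python
# A minimal sketch, assuming gaze coordinates in screen pixels and a simple
# map viewport: zoom key -> re-centre on the gaze point, pan key -> drift
# the view toward the gaze point.
from dataclasses import dataclass

@dataclass
class Viewport:
    cx: float           # centre of the visible area, in map coordinates
    cy: float
    scale: float = 1.0  # map units per screen pixel

SCREEN_W, SCREEN_H = 1920, 1080

def gaze_to_map(view, gaze_x, gaze_y):
    """Convert an on-screen gaze point to map coordinates."""
    return (view.cx + (gaze_x - SCREEN_W / 2) * view.scale,
            view.cy + (gaze_y - SCREEN_H / 2) * view.scale)

def zoom_at_gaze(view, gaze_x, gaze_y, factor=0.5):
    """Zoom in on the point being looked at (zoom key pressed)."""
    mx, my = gaze_to_map(view, gaze_x, gaze_y)
    return Viewport(cx=mx, cy=my, scale=view.scale * factor)

def pan_toward_gaze(view, gaze_x, gaze_y, speed=0.05):
    """Drift the viewport toward the gaze point (pan key held, per frame)."""
    dx = (gaze_x - SCREEN_W / 2) * view.scale * speed
    dy = (gaze_y - SCREEN_H / 2) * view.scale * speed
    return Viewport(cx=view.cx + dx, cy=view.cy + dy, scale=view.scale)

view = Viewport(cx=0.0, cy=0.0)
view = zoom_at_gaze(view, 1500, 300)      # look upper-right, tap zoom key
view = pan_toward_gaze(view, 1920, 540)   # hold pan key, look at right edge
print(view)
```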

Multitasking: At CES this year, I complimented the BlackBerry PlayBook for its easy swiping method to switch between applications. With a WebOS-like interface (or perhaps WebOS itself) and Tobii’s eye control, multitasking between apps is as natural as looking to the left or right of the screen. Using Windows 7, which isn’t at all optimized for anything other than a mouse, Barbara swapped between windows by looking at them and pressing a button. She also moved a mouse pointer icon around the screen with ease.

Gaming: The last demo we played was a simple Asteroids-like game. Your mission is to protect the Earth from a barrage of doomsday-sized comets and asteroids headed your way. Looking at an asteroid triggers a laser that blows it up. There are a ton of touch games like this that pit your reflexes against the computer, but eye control takes their speed and intensity up a notch. I’m incredibly excited to see what kinds of games can be made using the speed of the eye.
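The core mechanic here is simple gaze hit-testing: each frame, check whether the gaze point falls inside any target and, if so, fire at it. Here is a toy sketch of that idea; the Asteroid class and all the numbers are made up for illustration.

```python
# A toy sketch, assuming gaze coordinates in screen pixels: destroy the
# first live asteroid whose circle contains the gaze point.
from dataclasses import dataclass

@dataclass
class Asteroid:
    x: float
    y: float
    radius: float
    alive: bool = True

def fire_at_gaze(asteroids, gaze_x, gaze_y):
    """Blow up the first live asteroid under the gaze point, if any."""
    for rock in asteroids:
        if not rock.alive:
            continue
        if (gaze_x - rock.x) ** 2 + (gaze_y - rock.y) ** 2 <= rock.radius ** 2:
            rock.alive = False
            return rock
    return None

rocks = [Asteroid(300, 200, 40), Asteroid(900, 600, 60)]
hit = fire_at_gaze(rocks, 910, 590)   # gaze lands on the second asteroid
print(hit)
```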

These are only a few of the many new ways eye control could let you interact with a desktop or laptop. When you start thinking about how this technology could combine with voice recognition, the possibilities seem endless. In a few years, Minority Report may look dated.

It’s like Kinect for PCs

Lenovo-Tobii eye control PC

Microsoft has said that it will eventually release Kinect-like motion technology for PCs. Well, I hate to break it to them, but Tobii has done a lot of the hard work. Moving Kinect to a PC would mean shifting the focus from the body to the eyes and face, something Tobii has achieved with remarkable precision.

I’ve written a lot about Microsoft and the many challenges it faces with Windows 8. Currently, the company has a struggling Windows Phone platform and no tablet strategy. To forge ahead, Microsoft needs to generate excitement around its bread and butter, which is still the traditional keyboard-and-mouse PC. Tobii’s eye control is exactly the kind of innovative interface Microsoft could build a comprehensive new experience around. It could simplify the complexities of the Windows desktop OS while bridging some of the gaps between the PC and mobile touch platforms. If implemented properly, technology like this could reignite buzz around the laptop market, especially combined with many of the ideas Microsoft has demonstrated on Windows Phone and Xbox Kinect.

Barbara informed me that Microsoft is already a Tobii client: it buys the company’s larger eye-tracking research units, which the Redmond giant uses to study the effectiveness of its own application layouts and interfaces by tracking the eye movements of potential customers as they experience a new design. Microsoft, I’m looking at you. If you don’t try something like this, someone else will.

Regardless of who it is, one thing is clear: Tobii needs a strong partner that realizes the potential of its technology.

Eye control is coming, hopefully

Like any good technology, Tobii’s eye control is only as useful as the software that developers write for it. The company is in talks with a number of hardware, software, and platform makers to get its technology into PCs as soon as two years from now, but it will be a tough road ahead. The technology still needs to get smaller, draw less power, and cost less. Without Apple’s vision, touch tablets stagnated in limbo for a decade. Let’s hope that one of these platform makers recognizes the potential of eye control. The only loser here is the mouse, and I’m sorry, my friend, I love ya, but your days are numbered.

Jeffrey Van Camp