FastCompany Magazine

The official Tumblr of Fast Company.

Whaaaaaaaaaaaaaaaat.


Lenovo and Tobii today unveiled what they’re calling the “world’s first eye-controlled laptop,” and they’re giving it a public demo at the CeBIT show in Germany. The idea is to add what the makers call a “truly natural interface” to everyday computing, letting users of the laptop control what happens in the UI using the keyboard, the mouse, and where they glance on the screen, not entirely unlike the hand-eye-coordinated actions we perform in non-computing environments. This taps into the current trend of natural user interfaces, and could change how you think about PCs.

When you work on a computer, your hands are typically glued to a keyboard and mouse or trackpad to control what happens on the screen, but your eyes scatter and dart around as you focus on specific details. This is something Net researchers have known how to exploit for a while: Google, for instance, uses eye-tracking tech to work out where the attention-grabbing hotspots are on its homepage, so it can optimize the design to better position interface controls and (yes, of course) adverts.

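To make the hotspot idea concrete, here’s a minimal sketch: bin recorded gaze fixations into a coarse grid over a page and count which cells draw the most looks. This is not Google’s actual pipeline; the page dimensions, grid size, and fixation data below are all fabricated for illustration.

```python
# Toy sketch of the "attention hotspot" idea: bin recorded gaze
# fixations into a coarse grid over a page and count which cells
# draw the most looks. All data here is fabricated for illustration.

import numpy as np

PAGE_W, PAGE_H, CELL = 1200, 800, 100   # page size and grid cell, in pixels

def hotspot_grid(fixations):
    """Count fixations per grid cell; the busiest cells are hotspots."""
    grid = np.zeros((PAGE_H // CELL, PAGE_W // CELL), dtype=int)
    for x, y in fixations:
        grid[int(y) // CELL, int(x) // CELL] += 1
    return grid

# Fabricated session: 500 fixations clustered around one page region.
rng = np.random.default_rng(0)
fixations = np.clip(rng.normal([600.0, 300.0], 60.0, size=(500, 2)),
                    0, [PAGE_W - 1, PAGE_H - 1])
grid = hotspot_grid(fixations)
print(np.unravel_index(grid.argmax(), grid.shape))  # row/col of the busiest cell
```
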
Lenovo is exploiting Tobii’s tech in a much more interactive way. The system is at heart just an infrared light source and a camera that observes a user’s eyes by looking for reflective IR “glints” off the eyeballs. It combines that with clever software that works out where the eyes are positioned in space in front of the computer and where they’re looking, translating the interpreted gaze into a corresponding position on the screen.

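Tobii hasn’t published its algorithms, but the underlying pupil-center/corneal-reflection trick can be sketched in a few lines. The linear calibration model and all the numbers below are toy assumptions (real trackers fit per-eye 3D models); the point is just the core idea: the offset between the pupil center and the IR glint changes predictably as the eye rotates, so a short calibration run can map that offset to screen coordinates.

```python
# Toy pupil-center/corneal-reflection (PCCR) mapping. Real trackers fit
# per-eye 3D models; this sketch only shows the core idea: the offset
# between the pupil center and the IR glint varies with eye rotation,
# so a calibration run can map that offset to a screen position.

import numpy as np

def fit_calibration(offsets, screen_points):
    """Least-squares linear map from pupil-glint offsets (dx, dy) to
    screen (x, y), fitted while the user stares at known targets."""
    A = np.column_stack([offsets, np.ones(len(offsets))])  # rows: [dx, dy, 1]
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(screen_points), rcond=None)
    return coeffs  # 3x2 matrix

def gaze_to_screen(pupil, glint, coeffs):
    """Map one camera-frame measurement to an on-screen point."""
    dx, dy = pupil[0] - glint[0], pupil[1] - glint[1]
    return np.array([dx, dy, 1.0]) @ coeffs

# Fabricated calibration: the user looked at the four screen corners.
offsets = np.array([[-8, -5], [8, -5], [-8, 5], [8, 5]], dtype=float)
corners = np.array([[0, 0], [1920, 0], [0, 1080], [1920, 1080]], dtype=float)
coeffs = fit_calibration(offsets, corners)

# A zero pupil-glint offset should land near the screen center.
print(gaze_to_screen(pupil=(102.0, 55.0), glint=(102.0, 55.0), coeffs=coeffs))
# -> [960. 540.]
```

The glint matters because it acts as a fixed reference point on the cornea, which makes the pupil-glint offset far less sensitive to small head movements than the raw pupil position would be.
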
The result is that you can glance at an on-screen object like an icon and the system will pop up more info on that item. Maps can be scrolled or zoomed depending on the area of interest you’re concentrating on, and more subtle UI events can be worked in to improve your workflow. These include things like dimming windows you’re not looking at, switching focus between windows based on which one you want to see, and darkening the display if you’re not sitting looking at the machine (a far better way of working out that you’re “idle” than merely timing out if you don’t touch the keyboard or mouse for a certain period). For an idea of how this may work, the video below shows a Tobii eye-tracking UI developed way back in 2007.

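A key problem in any gaze UI is deciding when a wandering glance becomes intent, and the usual answer (presumably something like what Tobii does, though the details aren’t public) is dwell detection: fire an event only once the gaze has settled on the same target for a fraction of a second. Here’s a runnable toy version of the glance-at-an-icon case; the icon layout, sample rate, and thresholds are all invented.

```python
# Toy dwell detector: only treat a glance as intent once the gaze has
# settled on the same target for long enough (the glance-at-an-icon
# pop-up case). Icon layout, sample rate, and thresholds are invented.

DWELL_SAMPLES = 12                 # ~0.4 s at a 30 Hz tracker
RADIUS = 60                        # gaze-to-icon hit distance, in pixels
ICONS = {"mail": (100, 100), "maps": (400, 100)}   # icon centers

def icon_at(x, y):
    """Return the icon under a gaze point, if any."""
    for name, (cx, cy) in ICONS.items():
        if (x - cx) ** 2 + (y - cy) ** 2 <= RADIUS ** 2:
            return name
    return None

def dwell_events(samples):
    """Yield an icon's name once the gaze has dwelt on it long enough."""
    current, count = None, 0
    for x, y in samples:
        hit = icon_at(x, y)
        count = count + 1 if hit == current else 1
        current = hit
        if current and count == DWELL_SAMPLES:   # fire exactly once per dwell
            yield current

# Simulated stream: a quick flick over "mail", then a real dwell on "maps".
stream = [(105, 98)] * 5 + [(600, 400)] * 3 + [(398, 104)] * 20
print(list(dwell_events(stream)))   # -> ['maps']
```

The same loop shape extends to the subtler behaviors above: key a dim/undim call on whichever window the gaze currently hits, or treat a long run of no-eyes-detected samples as the walked-away signal instead of a keyboard timeout.
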
[Video: Tobii eye-tracking UI demo, 2007]

I don’t know what’s real anymore.
