
Eyes-on with Tobii's eye-tracking technology


January 20, 2014

Gizmag met up with Tobii at CES 2014 to see first-hand how the company's eye-tracking technology could change how we interact with computers and play video games


We've been following the development of Tobii's impressive eye-tracking technology for several years now, but it looks like consumers may actually get a chance to try it out for themselves in the not too distant future. Thanks to a partnership with SteelSeries, an eye tracker specifically for gamers is set to hit the market later this year, but that still leaves the question of what the technology can actually bring to video games. Fortunately, we were able to catch up with Tobii CEO Henrik Eskilsson on the CES show floor and try it out for ourselves.

Practicing in Windows 8

For my first foray with the eye tracker, Eskilsson set me up on a prototype laptop with the technology built into it, as opposed to the current dev kits, which are stuck right below a computer monitor with an adhesive. Before I could use it though, I first needed to go through a quick 30-second calibration process, which involved getting into a comfortable position and following a small white circle with my pupils as it moved around a black screen. Afterwards, the system let me indicate if I wore glasses or contacts and then saved my profile in case we switched users later. According to Eskilsson, you should most likely only need to do this once for each person who uses it.

To make sure the calibration was correct, he brought up the Start page in Windows 8 and had me flit my eyes over various tiles. As I glanced over them, each tile would highlight as I focused my gaze on it. Hitting a pre-set hot key on the keyboard would then open whatever I was looking at much like a mouse button, meaning I could basically navigate the entire screen with only one finger.

Surprisingly, the eye tracker was fairly precise with my selections and was quick to keep up with the movements of my eyes. Because you're practically mimicking the actions of using a mouse, minus the on-screen cursor, selecting items on the screen feels very natural. The main hurdle for me was resisting the urge to grab the mouse and keyboard out of habit.

Just for fun, I opened up the Windows calculator app and started punching in various equations just to see how fast I could do it. Using the eye tracker proved to be just as fast, if not faster, than using the usual mouse. It sounds innocuous just describing it, but even this simple task was a little mind-boggling, since for a brief moment it felt like the computer was reading my mind. Rather than entering the equations manually, it almost seemed like the computer was filling them in for me as I thought about them.

As a further demonstration of the eye tracker's accuracy, Eskilsson opened up the Maps program, zoomed it out to show the whole of the United States, and invited me to pinpoint my hometown. So I looked in its general area and double-tapped the hot key, which focused the screen on my home state. A few more double-taps and I was eventually staring at my county, then my town, and then my block, with my house squarely in the center. I was also briefly shown a PDF document, which I could scroll through by holding the key down and looking at the top or bottom of the screen.

Essentially, in Windows 8, the eye tracker allows for a wide range of basic actions using just a couple of buttons and your eyes. It works incredibly well for simple tasks, but more complex actions, like clicking and dragging, still require a mouse. Eskilsson claims to use the eye tracker for many of his day-to-day tasks, and I can see why. If I had the option, I'd browse the internet and check my email with only one finger too.

Game time

Carrying out some typical computer functions with just my eyes was impressive on its own, but I was really curious to see how well Tobii's technology could be applied to video games, given its recent partnership with SteelSeries.

So far, eye tracking has only been incorporated into four games – Starcraft 2, World of Warcraft, Civilization V, and Deus Ex: Human Revolution – but only WoW and Starcraft 2 were available on the show floor. As Eskilsson explained, Tobii's software is designed to run in tandem with other programs, so the original code for a game doesn't need to be altered at all. Theoretically, eye-tracking features could be added to any game, past, present, or future, by a dedicated programmer.

First up for me to try was WoW, which I haven't played much, but it was easy enough to pick up for the purposes of this demo. The eye tracker comes into play mainly in navigating the world of Azeroth without using your hands at all. Once you toggle auto-run on with a specified key, your avatar takes off in whatever direction you're looking, allowing you to steer it with just your eyes. To turn around, you just look toward the edge of the screen and the character moves in an arc until it's facing the other way. A different hot key brings up a large overlay menu with various icons representing your inventory, map, quests, etc. Looking at one as you release the key opens or closes that particular item, so you can quickly check it on the go. You can also attack or launch spells by hitting one of the game's hot keys while looking at your target.

It took a little while for me to get the hang of it (I ended up wedged between two rocks the first time I tried moving), but eventually I could guide myself towards a monster, kill it, and move on to another one without much effort. I'd be lying if I said I didn't feel compelled to pick up the mouse at times though, since that's how I've played computer games for over 20 years now. The eye tracker may come in handy for lengthy grinding sessions, but more complicated quests and micro-managing will still require a mouse.

Next up was Starcraft 2, which was much easier to control with the eye tracker, since I'm more familiar with the game. Hot keys for nearly every in-game action are included, so you can essentially play the whole game one-handed, with your direct line of sight acting as a cursor. By looking at any point on the map, you can select units and tell them where to go, what enemies to attack, what areas to patrol, and so on. The one thing lacking, however, is the ability to select multiple units of different types at once, though this could presumably be worked around by setting up a few groups ahead of time with the mouse.

Rather than scrolling through the level or clicking a spot on the mini-map in the corner, you can press a pre-set key that expands the mini-map to fill the whole screen. As with the large menu in WoW, you then focus on the area you want to expand, release the key, and your view will immediately shift to whatever spot your eyes were trained on. It's not a monumental difference, but it is a bit faster than the usual methods.

Tobii has also applied its eye tracker to Deus Ex: Human Revolution to demonstrate its use with first-person controls, but the demo wasn't working well enough for public consumption just yet. Supposedly though, the game will allow for additional actions, such as closing one of your eyes to look down a gun sight. It will also add head tracking features, allowing you to lean out from behind an in-game wall just by moving your own head to the side.

Looking to the future

Despite all the eye tracking gadgets on display, Tobii is mainly interested in developing the technology to pass on to manufacturers, who can then bring it to consumers. In addition to the prototype laptop, the company also had an eye tracker-equipped prototype tablet and computer monitor on display, though neither was running an interactive demo. There's even a game studio already working on a title built specifically for eye tracking, called Son of Nor, which will have players reshaping the landscape and throwing objects at enemies with just their eye movements.

However, Eskilsson says the eye tracker isn't meant as a replacement for the mouse and keyboard so much as an alternative that could add a new dimension of control. He's hoping that breaking into the gaming market through SteelSeries will open up further avenues for the technology. For now though, Tobii is making the latest version of its eye tracker available so other developers can try out their own ideas for it, much like Oculus has been doing with the Rift for the past couple years.

If you want to check out the eye tracking tech for yourself, Tobii is currently offering the EyeX dev kit on its website for US$95 until the end of January, after which it will be $195. Otherwise, a SteelSeries consumer model is due this summer.

Product Page: Tobii

About the Author
Jonathan Fincher grew up in Norway, China, and Trinidad before graduating from film school and becoming an online writer covering green technology, history, and design, as well as contributing to video game news sites like Filefront and 1Up. He currently resides in Texas, where his passions include video games, comics, and boring people who don't want to talk about either of those things.

Eye tracking in games is going to be most useful for managing stuff on the interface. For example, in WoW, it would be most useful in quickly activating abilities without having to memorize 20 hotkeys. I can also see it perfectly managing depth of field to where the player is looking.

Joel Detrow

There is a video of a demonstration at CES that shows a quick access menu for a first-person shooter; it makes eye tracking shine. You don't have to wait to move the mouse cursor to an item before activating it, and you don't have to let go with either hand: one hand can remain on the keyboard for moving, and the other can remain on the mouse for aiming.

However, I'd like to see whether eye tracking can help in a game like StarCraft. When pro gamers' eye movements were tracked, it seemed like their actions came first and their gaze followed. That is, they often don't even look where they click.


Comfort

I have not seen any examples of a developer doing serious programming on a touchscreen. I've seen programmers who operate in a three-monitor environment, and I don't think that repeatedly reaching their arms across to touch the screens would be comfortable over time.

Gorilla arm syndrome: "failure to understand the ergonomics of vertically mounted touchscreens for prolonged use. By this proposition the human arm held in an unsupported horizontal position rapidly becomes fatigued and painful".

Eye control can be much lower in physical exertion.

Augmentation, not replacement

Eye control can be an additional input that works together with your hands.

E.g. you can use a macro program like AutoHotkey to remap a keyboard button to click.


Look at any interface widget to highlight it, and then touch the application key on the keyboard to left-click and select it.

This brings the speed and concept of virtual buttons, like Android launcher icons and Windows 8 tiles, to desktop users.

Lastly, after using AutoHotkey for remapping, I soon ran out of keyboard buttons to attach macros and lines of code to, so I had to make new scripts that reuse the same buttons. With more scripts, it's easy to forget which button does what.

You can now optionally take your hands off the mouse. Instead, stare at a target on-screen button and press a keyboard key to click it, instantly invoking custom virtual buttons with your macros and commands attached to them. Quick activation of on-screen interface elements without a touchscreen becomes far more feasible; it virtually turns a non-touch screen into a touchscreen. You can design the buttons and controls to look however you want, and customizable virtual buttons are far more flexible than static physical keys.
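This "virtual touchscreen" idea can be sketched as a simple hit test: the activation key fires whichever labeled button the gaze currently rests on. A minimal Python sketch of that logic; the button labels and pixel coordinates are made up for illustration and are not from any real Tobii setup:

```python
# Sketch: hit-test a gaze point against labeled on-screen virtual buttons,
# the way an eye tracker plus an activation key might select one.
# Button labels and coordinates below are illustrative assumptions.
BUTTONS = [
    # (label, x, y, width, height)
    ("Google Search the Clipboard", 0, 0, 200, 50),
    ("Run Build Macro", 0, 60, 200, 50),
]

def button_at(gaze_x, gaze_y):
    """Return the label of the virtual button under the gaze point, if any."""
    for label, x, y, w, h in BUTTONS:
        if x <= gaze_x < x + w and y <= gaze_y < y + h:
            return label
    return None  # gaze is not over any virtual button

print(button_at(100, 25))  # Google Search the Clipboard
print(button_at(100, 55))  # None (in the gap between buttons)
```

A real implementation would read the gaze point from the tracker's API on every key press, but the selection step reduces to exactly this kind of lookup.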


E.g. I remapped F1 to launch a Google search on whatever is on the clipboard: F1::Run google.com/search?hl=en&safe=off&q=%Clipboard%

With another script, F1 could execute something completely different. And within that script, depending on the context, such as what program is currently running, or what window is in focus, the use of F1 could change again; it can get confusing.
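The confusion described here, where the same physical key means different things per script and per context, is essentially an unlabeled lookup table the user has to keep in their head. A minimal Python sketch of that lookup; the window names and action bindings are hypothetical:

```python
# Sketch: one physical key (F1) resolving to different actions depending
# on which program is in focus. All bindings below are hypothetical.
HOTKEY_TABLE = {
    # (focused window, key) -> action label
    ("browser", "F1"): "Google Search the Clipboard",
    ("editor", "F1"): "Insert code snippet",
    ("game", "F1"): "Cast healing spell",
}

def resolve(focused_window, key):
    """Return the action bound to `key` in the current context."""
    return HOTKEY_TABLE.get((focused_window, key), "unbound")

print(resolve("browser", "F1"))  # Google Search the Clipboard
print(resolve("game", "F1"))     # Cast healing spell
```

With only three contexts the table is already easy to misremember, which is the commenter's point: a labeled virtual button carries its own documentation, while F1 carries none.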

It would be more intuitive to look at a virtual button that is actually labeled, "Google Search the Clipboard", and then tap my activation key.

Already using your eyes

Before you move your mouse to select something, it is very likely that your eye gaze goes to the target first. The same thing goes for touch user interfaces. Your eyes are most likely already “touching” the interface widgets before you decide to actually reach out and physically touch them.

Achieving different actions on a target: eye highlighting + touching virtual function buttons vs. touch gestures alone vs. mouse clicking on a desktop

Eye highlighting + function buttons

If you had eye control on a touch device, you could have multiple go-to base function buttons (two or three, say) that you press after you highlight something with your eyes.

Example: a video

E.g. you look at a video you're about to watch: press function button one to open and play it, function button two to preview a thumbnail-sized highlight reel of it, and function button three to do whatever other command you want, like going to the comments section.
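In this scheme every action is the pair (target under the gaze, function button pressed), so a handful of buttons multiplies across target types. A small Python sketch of the video example; the action wording is the commenter's, the pair-based structure is an assumption about how such a system might dispatch:

```python
# Sketch: (gaze target type, function button) pairs mapped to actions,
# following the video example above. Bindings are illustrative.
FUNCTION_BUTTONS = {
    ("video", 1): "open and play",
    ("video", 2): "preview a thumbnail-sized highlight reel",
    ("video", 3): "go to the comments section",
}

def activate(gaze_target, button):
    """Press a function button while looking at a target; get its action."""
    return FUNCTION_BUTTONS.get((gaze_target, button), "no action")

print(activate("video", 1))  # open and play
print(activate("video", 3))  # go to the comments section
```

Adding a new target type ("image", "link", ...) reuses the same two or three physical buttons, whereas a pure touch-gesture scheme has to invent a new gesture per action.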

Touch alone: multiple touch gestures for different actions

Currently, if I take something like the Chrome icon on the home screen of Android, I can tap it to open it, or long-press it to move it. (Double tap, triple tap, and swiping are also available, but I think it ends there.)

Desktop: different types of mouse clicking for different actions

For desktop users, left and right single-click, left and right double-click, left and right mouse drag, and the middle mouse click are some examples of mouse clicking that achieve different actions on a target once a mouse cursor is on it. More advanced mice have even more keys and buttons that can be reprogrammed, as some people need more.

Advantages of eye tracking + function buttons: speed, comfort, and less hand movement.

Single tapping function keys would probably be faster and more comfortable than repeatedly doing double clicks, double taps, long presses/holds, or multi-finger gestures, such as pinching and zooming.

Since you may only need a few activation buttons, your thumbs or fingers reach out for far fewer things. If you're working with a larger, tablet-sized screen, which requires more hand movement to reach all the buttons and widgets, then limiting yourself to just a few buttons and hand positions gives you even more of a speed and comfort advantage over those who don't incorporate eye input.

Jeff Kang

I'd love to use this, but not as an alternative to mouse/joystick ... as an addition. In a gaming context, you need to be able to do different types of actions: for example, moving in one direction while looking in another, or accessing interface information/actions vs. in-world interaction.

In traditional games your information is displayed all around the edges of the screen, and lots of things are going on within the main view, so your eyes need to be continually darting around ... the software will need to discriminate between 'looking' and 'commanding'.

But do it right and this will change everything
