The Eye Tribe gaze tracking enables finger-free Fruit Ninja on Android

Android Central @ CES

The Eye Tribe was at CES 2014 showing off their eye-sensing bar, which lets users navigate without tapping or clicking anything. The bar has been available for a while for $99, but what was really interesting was the micro USB dongle they were showing running on an Android device. It enabled gaze tracking in a few productivity apps, but the most impressive demo it ran was Fruit Ninja, which you could play just by looking at different points on the screen.

The more serious, and slightly foreboding, application here is that gaze tracking can be used by advertisers to figure out which ads you look at and for how long. Obviously that kind of data would be hugely useful, especially to an ad giant like Google, but it likely pushes the boundaries of what people are comfortable with.

The infrared camera dongle isn't being sold; it's a proof of concept for mobile, with the final goal of getting manufacturers to integrate the technology directly into their devices. Developers and other early adopters can snag the PC sensor bar now for $99. It's also worth noting that it's difficult to go completely finger-free: your eyes are always on and always moving, often involuntarily, so touch is still useful for confirming selections or activating an eye-tracking mode.

Still catching up from the insanity of the show? Feel free to peruse our amazing #CESlive coverage hub!


Reader comments


Isn't Samsung *rumored* to be putting some eye technology in the upcoming Galaxy S 5?

Posted via Android Central App on my Nexus 5 (4.4.2)

This "technology" was already patented by Apple, Inc. back in July 2008... a US patent... not sure about worldwide.

Unless they came up with a way to do the same thing using a different design, I don't see how Apple can just stand quietly by.

I'm currently trying to patent the time machine. It doesn't exist yet and I'll never invent it but since I made a claim to it, the technology is mine even though someone else did all the work.

2008? Lol. It's been in use since the 1980s and the theory goes back decades before that. I can remember watching documentaries on TV about that stuff.

All that's happened is the technology is now cheap enough to be used for consumer tech.

Sounds like one of those typical '(something already used) ON A PHONE!!' bullshit patents.

Posted via Android Central App

Seems a bit creepy to me. I can see a few limited cases where it might be useful, but I haven't seen anything compelling yet as an end user.

*Comfort and ergonomics*

If you stare down at your device for too long, you can strain your neck. If you prop the device upright in a tablet holder, repeatedly reaching out to press the screen can cause pain:

Gorilla arm syndrome: "failure to understand the ergonomics of vertically mounted touchscreens for prolonged use. By this proposition the human arm held in an unsupported horizontal position rapidly becomes fatigued and painful".

I have not seen any examples of a developer doing serious programming on a touchscreen. I've seen programmers who work in three-monitor environments, and I don't think repeatedly reaching across to touch those screens would be comfortable over time.

Eye control requires less physical exertion.

*Already using your eyes*

Before you move your mouse to select something, it is very likely that your eye gaze goes to the target first. The same thing goes for touch user interfaces. Your eyes are most likely already “touching” the interface widgets before you decide to actually reach out and physically touch them.

*Achieving different actions on a target: eye highlighting + touching virtual function buttons vs. touch gestures alone vs. mouse clicking on a desktop*

Eye highlighting + function buttons

If you had eye control on a touch device, you could have a few go-to function buttons (say, two or three) that you press after you highlight something with your eyes.

Example: a video

E.g. you look at a video you're about to watch; then you could press function button one to open and play it, function button two to preview a thumbnail-sized highlight reel of it, and function button three could be set to whatever other command you want, like jumping to the comments section.
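A minimal sketch of the idea, assuming a hypothetical GazeTracker interface and invented action names (this is not The Eye Tribe's actual SDK): the eyes pick the target, and a function key picks the verb.

```kotlin
// Hypothetical sketch: GazeTracker, FunctionKey, and the action methods are
// all invented for illustration; The Eye Tribe's real SDK is not shown here.
enum class FunctionKey { F1, F2, F3 }

interface GazeTracker {
    fun currentTarget(): String?  // id of the widget currently under the user's gaze
}

class GazePlusButtons(private val tracker: GazeTracker) {
    // The eyes select the target; a single physical/virtual key selects the action.
    fun onFunctionKey(key: FunctionKey) {
        val target = tracker.currentTarget() ?: return  // nothing highlighted, ignore
        when (key) {
            FunctionKey.F1 -> play(target)          // open and play the video
            FunctionKey.F2 -> preview(target)       // thumbnail-sized highlight reel
            FunctionKey.F3 -> openComments(target)  // user-assignable third command
        }
    }

    private fun play(id: String) { println("playing $id") }
    private fun preview(id: String) { println("previewing $id") }
    private fun openComments(id: String) { println("comments for $id") }
}
```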

Touch alone: multiple touch gestures for different actions

Currently, if I take something like the Chrome icon on the Android home screen, I can tap it to open it, or long-press it to move it. (There are also double taps, triple taps, and swipes available for use, but I think it ends there.)
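For what it's worth, Android's GestureDetector is the real API that lets one view route those gestures to different actions; the openItem/previewItem/startDrag handlers below are placeholders:

```kotlin
import android.content.Context
import android.view.GestureDetector
import android.view.MotionEvent
import android.view.View

// Routes single tap, double tap, and long press on one view to three
// different actions. The open/preview/drag bodies are placeholders.
class MultiActionTouchListener(context: Context) : View.OnTouchListener {
    private val detector = GestureDetector(context,
        object : GestureDetector.SimpleOnGestureListener() {
            override fun onDown(e: MotionEvent): Boolean = true  // claim the gesture
            override fun onSingleTapConfirmed(e: MotionEvent): Boolean {
                openItem()     // e.g. tap the Chrome icon to launch it
                return true
            }
            override fun onDoubleTap(e: MotionEvent): Boolean {
                previewItem()  // a second action on the same target
                return true
            }
            override fun onLongPress(e: MotionEvent) {
                startDrag()    // long-press to start moving the icon
            }
        })

    override fun onTouch(v: View, event: MotionEvent): Boolean =
        detector.onTouchEvent(event)

    private fun openItem() { /* launch the app */ }
    private fun previewItem() { /* show a preview */ }
    private fun startDrag() { /* begin drag mode */ }
}
```

You would attach it with view.setOnTouchListener(MultiActionTouchListener(context)).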

Desktop: different types of mouse clicking for different actions

For desktop users, left and right single clicks, left and right double-clicks, left and right mouse drags, and the middle click are some examples of mouse input that achieve different actions on a target once the cursor is on it. More advanced mice have even more buttons that can be reprogrammed, for people who need more.
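The desktop version is just as mechanical. In plain JVM Kotlin, java.awt.event.MouseEvent already exposes the button and click count, so one listener can fan out to different actions (the println actions are placeholders):

```kotlin
import java.awt.event.MouseAdapter
import java.awt.event.MouseEvent

// One listener discriminating mouse button and click count; attach it to any
// AWT/Swing component with component.addMouseListener(clickRouter).
val clickRouter = object : MouseAdapter() {
    override fun mouseClicked(e: MouseEvent) {
        when {
            e.button == MouseEvent.BUTTON1 && e.clickCount == 2 -> println("left double-click: open")
            e.button == MouseEvent.BUTTON1 -> println("left click: select")
            e.button == MouseEvent.BUTTON2 -> println("middle click: e.g. open in new tab")
            e.button == MouseEvent.BUTTON3 -> println("right click: context menu")
        }
    }
}
```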

Advantages of eye tracking + function buttons

Single-tapping a function key would probably be faster and more comfortable than repeatedly doing double clicks, double taps, or multi-finger gestures such as pinch-to-zoom.

Since you may only need a few activation buttons, your thumbs or fingers reach for fewer things. On a larger, tablet-sized screen, which requires more hand movement to reach everything, confining yourself to a few buttons and positions gives you even more of a speed and comfort advantage.

At 13:30 of the video of Eye Tribe's presentation at TechCrunch's Hardware Battlefield, Sune Johansen pitches a future application of augmented reality in glasses to Martha Stewart and the rest of the judges: you could use your eyes to manipulate the augmented reality projections. Another example is a sensor detecting that you're looking at a person and popping up their online profile (which, by the way, would definitely help with breaking the ice and starting a conversation).

Notably, Google has been granted an eye-tracking patent that involves recording advertisement impressions through eye glances with "pay-per-gaze," and another patent that describes unlocking a device by having a head-mounted accessory track the patterns of the pupil. So while it's just speculation, it isn't unreasonable to think that Google could soon throw its weight behind eye tracking, especially if you factor in projections that have Google Glass shipping in 2014.