
source link: https://www.androidpolice.com/eyemu-eye-tracking-phone-interation-experiment/

This neat eye-tracking experiment looks like a weirdly convenient way to interact with your phone

By Will Sattelberg Published 1 day ago

It's perfectly paired with small motion gestures

Although a handful of small phones still exist, it's no secret the entire industry has moved into a "bigger is always better" mindset. A large display can come in handy when you're editing documents on the go or watching a downloaded movie on a plane, but you'll need two hands to tap on every part of the screen. Researchers at Carnegie Mellon University's Future Interfaces Group have created a potential solution to this ever-growing problem.

Unlike most accessibility projects, this particular experiment started with the goal of making smartphone control easier for everyone. According to a press release from CMU's Computer Science department (via TechCrunch), the team began by asking if there was "a more natural mechanism to use to interact with the phone." Obviously, your finger makes the most sense — you're directly interacting with icons and control elements on-screen. Every additional input device we've seen over the years, like styluses, has worked in the same fashion.

The Future Interfaces Group team decided to work on eye tracking instead. It's not a new idea, but judging by the hands-on video published last fall, this is one of the most successful attempts yet. With EyeMU, the user can select and open notifications, return to previous apps, and even select specific photos. These gaze gestures are paired with small movements of the phone itself — flicking it left or right, or moving it closer to or farther from your face. It's better seen in action than described, so check out the clip below for a full demo.

The biggest difference between this approach and older attempts at the same technology comes down to restraint. The team knew you can't pair an action with every single glance — otherwise, how would you get anything done? That's where the motion sensors come in, acting as confirmation prompts whenever you need to select, move, or dismiss something.

Obviously, this experiment is still very much a demo of what could come down the line in the future. Who knows — maybe the Pixel 11 will be powered by similar technology.

About The Author

Will Sattelberg (974 Articles Published)

Will has been an Android enthusiast since he got his first smartphone in 2011. He loves watching movies, has a never-ending backlog of video games, and produces podcasts in his spare time. He lives in Buffalo, NY and is willing to give you chicken wing recommendations at any time. Just ask.
