Good Thinking

Eye-tracking video tech shows how people really see the world

The green dot (middle) indicates where within the shot the person is looking
Asst. Prof. Mark Lescroart helps graduate student Chris Sinnott calibrate the glasses' targeting mechanism

Video-camera-equipped glasses may show you what the wearer's head is pointing at, but they certainly don't indicate what the person's gaze is actually fixed upon. A new headset is designed to do just that, however, and it could be used to advance a number of technologies.

Developed by a team at the University of Nevada, Reno, the system is based around a set of glasses manufactured by Germany's Pupil Labs. That eyewear, which is already commercially available, utilizes two inward-facing cameras to track the user's eye movements. It also features an inertial measurement unit (IMU), a motion sensor that combines an accelerometer and a gyroscope.

Led by Asst. Prof. Paul MacNeilage, the U Nevada researchers added two forward-facing cameras that record the view looking ahead from the user's face. Rather than weighing the glasses down and adding to their bulk by incorporating a microprocessor, the team instead linked the glasses to a laptop carried in a backpack.

The combined setup not only records the forward view and the eye movements, but it also tracks the wearer's GPS coordinates, their head movements, and their body movement through three-dimensional space. After a test subject has gone for a walk while wearing the system, the resulting video footage consists of the output of the forward-facing cameras, with an ever-moving green dot indicating what their eyes were focused upon at any given time.
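In principle, producing that green-dot footage comes down to mapping each gaze estimate onto the corresponding video frame. As a rough illustration only (the team's actual pipeline and data formats are not described in this article), the sketch below assumes gaze estimates arrive as normalized (x, y) coordinates in [0, 1], converts them to pixel positions, and stamps a dot into a frame represented as rows of RGB tuples:

```python
# Hypothetical sketch: overlay a gaze dot on a forward-camera frame.
# Assumes normalized gaze coordinates in [0, 1]; not the actual
# University of Nevada, Reno pipeline.

def gaze_to_pixel(gaze_x, gaze_y, width, height):
    """Map a normalized gaze estimate to pixel coordinates, clamped to the frame."""
    px = min(max(int(gaze_x * width), 0), width - 1)
    py = min(max(int(gaze_y * height), 0), height - 1)
    return px, py

def draw_dot(frame, px, py, radius=1, color=(0, 255, 0)):
    """Stamp a small filled green dot into an RGB frame
    (a list of rows of (r, g, b) tuples), clipped at the edges."""
    height, width = len(frame), len(frame[0])
    for y in range(max(0, py - radius), min(height, py + radius + 1)):
        for x in range(max(0, px - radius), min(width, px + radius + 1)):
            frame[y][x] = color
    return frame

# Example: an 8x8 black frame, with the wearer looking at the center of the view.
frame = [[(0, 0, 0) for _ in range(8)] for _ in range(8)]
px, py = gaze_to_pixel(0.5, 0.5, 8, 8)
draw_dot(frame, px, py)
```

Repeating this per frame, with each frame's time-aligned gaze sample, would yield the kind of annotated first-person video the researchers describe.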

By contrast, while there are other systems that track people's eye movements in response to projected visual stimuli, they're typically stationary lab-based setups in which the person's head is set rigidly in place against a chin rest.

Asst. Prof. Mark Lescroart (left) and Asst. Prof. Paul MacNeilage, with two versions of the headset

Plans now call for four different labs to each be given five of the new headsets. The devices will subsequently be worn by volunteers ranging in age from five to 70 years old, as they stroll through places such as museums or libraries, or perform activities like shopping, cycling or commuting.

All told, over 240 hours of first-person video will be gathered and stored on servers located at the university. That "Visual Experience Database" will be freely accessible to scientists from other institutions who wish to learn more about how people perceive the world around them. Ultimately, it is hoped that the technology could lead to advancements in fields such as neuroscience, vision science, cognitive science, artificial intelligence, or even art.

"We’re building a first-person view video database to provide visual data that is more in line with human experience," says MacNeilage. "We aim to find out what does it look like when people walk and move their heads to navigate through the world. We use a simple paradigm, find out how people sample the visual environment with their eyes."

An example of the video can be seen below.

First-Person Visual Database video-data gathering

Source: University of Nevada
