We've already seen eye-tracking systems being used to control things like laptops and TVs, but ... cars? Well, the Visteon Corporation isn't suggesting that we use our eyes to steer our cars. At least, not yet. Its HMeye cockpit concept, however, is designed to show how such technology could be used to help drivers keep their attention on the road.
The demonstrator cockpit combines multiple cameras with controls on the yoke-style steering wheel. Those cameras track both the direction of the driver's gaze and the position of their head, allowing the computer running the system to determine which part of the vehicle's LCD instrument panel the driver is looking at.
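At its core, that kind of gaze-to-panel mapping can be thought of as intersecting a gaze ray with the plane of the instrument panel, then checking which on-screen region the intersection falls in. The sketch below is purely illustrative – the function names, panel layout and geometry are assumptions, not Visteon's actual implementation.

```python
def gaze_to_panel_point(head_pos, gaze_dir, panel_z=0.0):
    """Intersect a gaze ray with the panel plane z = panel_z.

    head_pos: (x, y, z) of the driver's eyes in metres, with z > panel_z.
    gaze_dir: (dx, dy, dz) gaze direction; dz must be negative
              (i.e. looking toward the panel).
    Returns an (x, y) point on the panel plane, or None if the gaze
    never reaches it.
    """
    x0, y0, z0 = head_pos
    dx, dy, dz = gaze_dir
    if dz >= 0:  # looking away from (or parallel to) the panel
        return None
    t = (panel_z - z0) / dz
    return (x0 + dx * t, y0 + dy * t)


def classify_region(point, regions):
    """Return the name of the first region containing the point, if any."""
    if point is None:
        return None
    x, y = point
    for name, (xmin, xmax, ymin, ymax) in regions.items():
        if xmin <= x <= xmax and ymin <= y <= ymax:
            return name
    return None


# Illustrative panel layout: (xmin, xmax, ymin, ymax) in metres.
REGIONS = {
    "speedometer": (-0.30, -0.10, 0.00, 0.15),
    "navigation":  (-0.05,  0.05, 0.00, 0.15),
    "radio":       ( 0.10,  0.30, 0.00, 0.15),
}
```

For example, eyes at (0, 0.1, 0.7) looking slightly down and forward at (0, -0.05, -0.7) would land on the "navigation" region of this made-up layout. A production system would of course have to fuse noisy camera estimates of both head pose and eye direction rather than assume a clean ray.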
While some functions (such as switching between displays) are managed via the button controls on the steering wheel, others are based purely on gaze – these include things like the radio, climate control and navigation systems.
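One open question with gaze-only control is how to avoid accidental activations when the driver's eyes merely pass over a control. A common pattern in gaze interfaces is dwell-time selection: a target only fires after the gaze has rested on it for a set interval. The class below is a hypothetical sketch of that pattern – the timings and API are assumptions, not details of the HMeye system.

```python
class DwellSelector:
    """Trigger a gaze target only after the gaze dwells on it long enough."""

    def __init__(self, dwell_seconds=0.8):
        self.dwell = dwell_seconds
        self._target = None
        self._since = None

    def update(self, target, now):
        """Feed the currently gazed-at target (or None) at time `now` (seconds).

        Returns the target once the gaze has dwelt on it for the full
        interval, otherwise None. Resets whenever the gaze moves elsewhere.
        """
        if target != self._target:
            self._target = target
            self._since = now
            return None
        if target is not None and now - self._since >= self.dwell:
            self._since = now  # re-arm so a held gaze repeats slowly
            return target
        return None
```

With a 0.8-second dwell, glancing at "radio" for half a second does nothing, while holding the gaze past 0.8 seconds selects it.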
HMeye is a play on the widely-used acronym HMI, which stands for human-machine interface.
The idea behind it is that since drivers won't have to physically reach out and touch the console, they won't have to take their hands off the wheel, or look away from the road for as long. Given that it still relies on drivers looking down, however, it's hard to say just how much safer it actually is – particularly when compared to voice-control technology.
Other systems – such as Fraunhofer's Eyetracker – also combine eye-tracking with driving, although they do so in order to detect driver fatigue or inattention.
You can see the HMeye cockpit in use in the video below.