Kia is looking to make future autonomous vehicles as pleasant as possible, launching a set of mood-adaptive features as well as a new touch-free interface system that tracks eyes and fingertips, unveiled at this year's CES in Las Vegas.
The R.E.A.D. (Real-time Emotion Adaptive Driving) system tracks passengers' state of mind by monitoring facial expressions, heartbeat and other biomarkers. Then it optimizes the cabin to influence you in the direction of a better mood, using lighting, audio, climate control, aromatherapy and even seat vibration.
Kia has chosen to demonstrate this tech in perhaps the least appealing way imaginable, with a video starring an emotionally unstable fish who gets stressed out by things like the sun and birds looking at him the wrong way. Enjoy this masterpiece before moving on:
The other part of Kia's CES presentation focuses on virtual touch (V-Touch) interfaces that allow you to control car functionality using hand gestures that require no button pressing. Kia envisages V-Touch working through the use of 3D cameras that track the fingertips and eye motions of drivers and passengers.
It's said to allow you to point at the function you want to change – perhaps the air-con, perhaps the sunroof – then swipe your finger in mid-air to effect button-free control. How exactly gesture commands are to be separated from the hand motions typical of a gesticulating conversationalist remains to be seen, but we can see the average Italian passenger setting things off all over the place by accident.
While self-driving tech itself has remained tantalizingly out of reach so far – and has been described by some as the most complex problem humanity has ever tried to solve – many companies are still projecting that Level 4 and 5 autonomous cars are as close as five years away. They'll certainly change the way we get around, and it's interesting to see how companies believe they can make the interior of these hands-off transport machines more functional and pleasant. Just please don't treat us like sunfish.
Check out a V-Touch video below.
Source: Kia