Touch-based devices let users feel directions
In-car navigation systems that literally tell drivers where to go are far more convenient and safer than resting a street directory on one's lap and hurriedly trying to devise a route on a map at a set of traffic lights. But audio instructions may not always be the best way to impart directional information to hard-of-hearing drivers or those yakking on a mobile phone (with a hands-free kit, one should hope). A new study suggests that devices mounted on a steering wheel that pull the driver's index fingertips left or right could help motorists drive more safely. The same technology could also be attached to a cane to provide directional cues to blind pedestrians.
The study, carried out by researchers at the University of Utah, was based on a “multiple resource model” of how people process information, in which resources are senses such as vision, hearing and touch that provide information to the brain.
"You can only process so much," says the study's lead author, William Provancher, an assistant professor of mechanical engineering at the University of Utah. "The theory is that if you provide information through different channels, you can provide more total information. Our sense of touch is currently an unexplored means of communication in the car."
The study was conducted on a driving simulator with two devices attached to the steering wheel so that one came into contact with the index finger of each of the driver's hands. As the participant drove, each index fingertip rested on a red TrackPoint cap from an IBM ThinkPad computer that gently tugged the skin of the fingertip in the appropriate direction when a turn was approaching.
For two scenarios without mobile phones, results showed that drivers' accuracy in correctly moving left or right was nearly identical whether they received tactile directions through their fingertips (97.2 percent) or by computerized voice (97.6 percent). Interestingly, drivers talking on a mobile phone were accurate 98 percent of the time when receiving fingertip navigation directions, but this dropped to 74 percent when they relied on audio cues. The study "doesn't mean it's safe to drive and talk on the cell phone," says co-author David Strayer, a professor of psychology at the University of Utah. "It was a test to show that even in situations where you are distracted by a cell phone, we can still communicate directional information to the driver via the fingertips even though they are 'blind' to everything else."
Provancher points out that motorists already get some feedback through touch while driving: vibration from missing a gear while shifting or a shimmying steering wheel due to tire problems. Some carmakers also already use some tactile systems to warn of a car drifting out of a lane or to monitor blind spots, but these devices generally twist the steering wheel through an assisted steering system, rather than simply prompting the driver to do so.
Provancher has patents on his tactile feedback devices and hopes to commercialize them for steering wheels and other uses. He has already had preliminary talks with three automakers and a European original equipment manufacturer, and says that if any of them wanted to move forward with the idea, such devices could be in production cars within three to five years.
In addition, Provancher says he is "starting to meet with the Utah Division of Services for the Blind and Visually Impaired to better understand how our technology could help those with vision impairments. It could be used in a walking cane for the blind," with a moving button on the handle providing tactile navigation cues to help the person walk to the corner market, for example.

As well as applications for the vision- and hearing-impaired, Provancher says the technology could be used in a handheld device to let people feel fingertip-stretch pulses, rather than hear clicks, as they scroll through an iPod music playlist. He also says it might be used as a new way to interact with an MP3 music player in a vehicle, or to control games.
The findings of the study are being presented in San Francisco today at the Human Factors and Ergonomics Society’s 54th annual meeting.