For obvious reasons, texting and email are preferred methods of communication for many deaf and hard of hearing mobile phone users. But as convenient as texting can be, it isn't always the most reliable form of communication – messages can take a while to arrive, and short messages can easily be misinterpreted. To address this problem, University of Washington (UW) engineers are developing the first device able to transmit American Sign Language (ASL) over U.S. cellular networks.
The advent of mobile phones with larger displays and video calling would seem to be an obvious solution for people who communicate with sign language. In countries where 3G networks are ubiquitous, people are already using mobile phones for sign language communication. However, in areas with low bandwidth, even today's best video encoders struggle to produce the video quality needed for intelligible sign language.
The MobileASL project at UW has been working to optimize compressed video signals for sign language. By increasing image quality around the face and hands, researchers have brought the data rate down to 30 kilobytes per second while still delivering intelligible sign language. MobileASL also uses motion detection to identify whether a person is signing or not, in order to extend the phone's battery life during video use.
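The signing-detection idea above can be illustrated with frame differencing: compare consecutive frames and treat low overall pixel change as "not signing", so the encoder could drop the frame rate and save power. This is only a minimal sketch of the general technique; the function name, frame sizes, and threshold are illustrative assumptions, not MobileASL's actual implementation.

```python
import numpy as np

def signing_activity(prev_frame, curr_frame, threshold=8.0):
    """Estimate whether the user is actively signing by measuring the
    mean absolute pixel change between two consecutive grayscale frames.
    Frames are 2-D uint8 arrays; the threshold is an illustrative value,
    not a parameter from the MobileASL project."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return float(diff.mean()) > threshold

# A still scene (identical frames) registers no activity:
still = np.full((120, 160), 128, dtype=np.uint8)

# A moving hand-and-arm-sized patch pushes the mean change past the threshold:
moving = still.copy()
moving[40:100, 40:120] += 60

print(signing_activity(still, still))    # False
print(signing_activity(still, moving))   # True
```

When activity stays below the threshold for a stretch of frames, a real encoder would reduce the capture and encode rate rather than stop entirely, so the call resumes smoothly when signing restarts.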
The team says transmitting sign language as efficiently as possible increases affordability, improves reliability on slower networks and extends battery life, even on devices that might have the capacity to deliver higher quality video.
Sign of the times
Newly released high-end phones, such as the iPhone 4 and the HTC Evo, offer video conferencing. But users are already running into hitches – broadband companies have blocked the bandwidth-hogging video conferencing from their networks, and are rolling out tiered pricing plans that would charge more to heavy data users.
The UW team estimates that iPhone's FaceTime video conferencing service uses nearly 10 times the bandwidth of MobileASL. Even after the anticipated release of an iPhone app to transmit sign language, people would need to own an iPhone 4 and be in an area with very fast network speeds in order to use the service. The MobileASL system could be integrated with the iPhone 4, the HTC Evo, or any device that has a video camera on the same side as the screen.
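To see why the difference matters on tiered data plans, a back-of-envelope calculation using the article's figures (30 kilobytes per second for MobileASL, and the UW team's estimate that FaceTime uses nearly 10 times that) gives per-minute data usage:

```python
# Figures from the article; the 10x multiplier is the UW team's estimate.
mobileasl_rate_kBps = 30
facetime_rate_kBps = mobileasl_rate_kBps * 10

def megabytes_per_minute(rate_kBps):
    """Convert a kilobytes-per-second video rate to megabytes per minute."""
    return rate_kBps * 60 / 1000

print(megabytes_per_minute(mobileasl_rate_kBps))  # 1.8 MB per minute
print(megabytes_per_minute(facetime_rate_kBps))   # 18.0 MB per minute
```

At roughly 1.8 MB per minute, even frequent signed calls stay modest on a capped data plan, while a 10x-heavier stream adds up quickly.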
"We want to deliver affordable, reliable ASL on as many devices as possible," Riskin said. "It's a question of equal access to mobile communication technology."
Field testing
Initial field tests are just being completed with participants in a UW summer program for deaf and hard-of-hearing students, using phones imported from Europe a couple of years ago. In the first two and a half weeks of the study, some 200 calls were made with an average call duration of a minute and a half, researchers said.
"We know these phones work in a lab setting, but conditions are different in people's everyday lives," said project leader Eve Riskin, a UW professor of electrical engineering. "The field study is an important step toward putting this technology into practice."
Most study participants say texting or e-mail is currently their preferred method for distance communication. Their experiences with the MobileASL phone are, in general, positive. A larger field study will begin this winter.