Good Thinking

App uses a smartphone's camera to guide the blind to bus stops

Apps that use GPS coordinates alone may not get users close enough to the actual bus stop
Depositphotos

While there are already apps that guide blind users to a bus stop's approximate GPS coordinates, those people may unknowingly end up standing too far away from the actual stop. A new app addresses that shortcoming by letting the smartphone's camera in on the act.

Known as All_Aboard, the AI-based app was developed by a team of scientists at the Harvard-affiliated Massachusetts Eye and Ear treatment and research center. It's made to be used along with a third-party GPS-based navigational app such as Google Maps.

Users start by using that other app to get to a bus stop's approximate location. They then open All_Aboard and hold their smartphone up so that its rear camera can "see" the surrounding street.

Drawing upon a deep-learning neural network trained on approximately 10,000 images of bus stops in a given city, the app is reportedly able to visually identify the target stop's sign from up to 50 ft (15 m) away. Once the sign has been spotted, the app guides the user via sonar-like beeps which change in pitch and speed as the person gets closer to the bus stop.
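The paper's authors haven't published the app's internals, but the behavior described above – detect the sign, estimate how far away it is, then turn that distance into beeps that change pitch and speed – can be roughed out. The Python sketch below is purely illustrative and is not All_Aboard's actual code: the sign height, camera focal length, detection range and distance-to-beep mapping are all assumed values, and the visual detector itself is left as a stub.

```python
import math

# Assumed constants for illustration only; All_Aboard's real values are not published.
SIGN_HEIGHT_M = 0.60       # assumed physical height of a bus-stop sign panel
FOCAL_LENGTH_PX = 1500.0   # assumed focal length of the phone camera, in pixels
MAX_RANGE_M = 15.0         # ~50 ft (15 m) detection range reported for the app

def estimate_distance(bbox_height_px: float) -> float:
    """Rough pinhole-camera distance estimate from the sign's apparent height in the frame."""
    return (SIGN_HEIGHT_M * FOCAL_LENGTH_PX) / bbox_height_px

def beep_parameters(distance_m: float) -> tuple[float, float]:
    """Map distance to (pitch in Hz, beeps per second).

    Closer to the sign -> higher pitch and faster beeps, mimicking the
    sonar-like feedback described in the article (the exact mapping is assumed).
    """
    closeness = 1.0 - min(distance_m, MAX_RANGE_M) / MAX_RANGE_M  # 0 = far, 1 = at the sign
    pitch_hz = 440.0 + 440.0 * closeness   # 440 Hz at max range, 880 Hz at the sign
    rate_hz = 1.0 + 4.0 * closeness        # 1 beep/s at max range, 5 beeps/s at the sign
    return pitch_hz, rate_hz

# Example: a (stubbed) detector reports a sign whose bounding box is 120 px tall.
distance = estimate_distance(120.0)
pitch, rate = beep_parameters(distance)
print(f"distance ≈ {distance:.1f} m, beep pitch ≈ {pitch:.0f} Hz, rate ≈ {rate:.1f}/s")
```

In a real app, the detector would run on each camera frame and the beep parameters would be updated continuously as the user walks toward the sign; this sketch only shows the distance-to-feedback step for a single detection.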

In field tests, 24 legally blind volunteers used both Google Maps and All_Aboard to locate a total of 20 bus stops – 10 in an urban setting (Boston) and 10 in a suburban environment (Newton, Massachusetts).

When it came to getting "close enough" to those stops, Google Maps alone had a success rate of just 52%, whereas All_Aboard boosted that figure to 93%. Additionally, while the average distance between the navigation endpoint and the actual bus stop was 6.62 m (21.7 ft) with Google Maps, it was just 1.54 m (5 ft) with All_Aboard.

"Our findings suggest that the All_Aboard app could help travelers with visual impairments in navigation by accurately detecting the bus stop, and therefore greatly reducing their chance of missing buses due to standing too far from the bus stops," said Massachusetts Eye and Ear's Assoc. Prof. Gang Luo. "This study indicates that computer vision-based object recognition capabilities can be used in a complementary way and provide added benefit to purely mapping-based, macro-navigation services in real-world settings."

All_Aboard has so far been trained on imagery from 10 major cities around the world, and is available for use on iPhones via the App Store. It is described in a paper that was recently published in the journal Translational Vision Science & Technology, and is demonstrated in the video below.

Source: Massachusetts Eye and Ear
