Smart cane packs parking sensor tech and beeps when collisions are imminent
We've seen a few different attempts to smarten up the walking cane over the years, including versions equipped with facial recognition to identify friends and others with laser rangefinders to detect nearby objects. The mySmartCane system is the latest modern-day take on this centuries-old mobility aid, consisting of a ball that can be retrofitted to existing canes to equip them with parking sensor-like technology and give users a better sense of their surroundings.
The mySmartCane was dreamt up by Vasileios Tsormpatzoudis, a PhD student at the University of Manchester, after seeing his mother struggle with retinitis pigmentosa, an inherited degenerative eye disease that breaks down the cells in the retina and causes severe vision impairment. The inventor also spoke with a number of white cane users, who left him with the impression that an emphasis on simplicity and affordability would be the best approach.
So he 3D-printed a ball that he says can be retrofitted to any white cane and packs ultrasonic sensors that measure the distance to nearby objects without contact. This is actually the same sensing technology used in the UltraCane that we covered more than 10 years ago, though the way it guides the user through the environment is a little bit different.
Rather than converting distance information into vibrations felt in the handle, as the UltraCane does, the mySmartCane translates it into an audio signal. Just as a parking sensor beeps more frequently the closer your bumper gets to the car or object behind you, the system beeps faster as the user approaches a nearby obstacle. The sounds can be relayed via regular or bone-conducting headphones.
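For readers curious how that parking-sensor behavior works, the core idea is simple: map the measured distance onto a beep interval, so the gap between beeps shrinks as the obstacle gets closer. The sketch below is purely illustrative, assuming made-up range limits and timing values; it is not the mySmartCane firmware.

```python
# Illustrative sketch only (not mySmartCane code): map an ultrasonic
# distance reading to the time between beeps, parking-sensor style.

def beep_interval(distance_cm, min_cm=20, max_cm=200,
                  fastest_s=0.1, slowest_s=1.0):
    """Return seconds between beeps; None means stay silent.

    Closer obstacle -> shorter interval -> faster beeping.
    All thresholds here are assumed example values.
    """
    if distance_cm >= max_cm:
        return None  # nothing in range: no sound
    if distance_cm <= min_cm:
        return fastest_s  # very close: beep at the maximum rate
    # Linearly interpolate between the fastest and slowest intervals
    frac = (distance_cm - min_cm) / (max_cm - min_cm)
    return fastest_s + frac * (slowest_s - fastest_s)

# Example: an obstacle halfway through the range beeps at a middling rate.
print(beep_interval(110))  # 0.55
print(beep_interval(250))  # None (out of range)
```

A real device would loop this against live sensor readings and drive a speaker or headphone output, but the distance-to-rate mapping is the essence of the "beeps when collisions are imminent" behavior described above.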
Tsormpatzoudis developed the experimental system as a research project, and it is very much in its early prototype phase, though he does have a few ideas about how to improve on the design. These include adding another sensor to detect overhead obstacles like sign-posts and doorways and possibly bringing vibrations into the mix as a way of guiding the user.
You can hear him speak about the project in the video below.
Source: University of Manchester