Researchers turn Kinect into a yoga instructor for the visually impaired
Conventional yoga classes with an instructor up front demonstrating positions to the class aren't generally a viable option for the visually impaired, but a team of computer scientists from the University of Washington (UW) is set to open this healthy activity up to such users with the help of Microsoft's Kinect.
The UW team, led by doctoral student Kyle Rector, has developed a program called Eyes-Free Yoga that uses the Kinect's skeletal tracking capabilities to read a user's body angles and then provide auditory feedback on how to perform poses in real time.
The system calculates the required angles using simple geometry and the law of cosines based on data of the user's body collected by the Kinect's cameras. It then instructs them on how to reach the desired pose, starting with the user's core, suggesting alignment changes if necessary, then progressing to instructions for the head and neck before finally giving directions for the arms and legs.
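The angle calculation described above can be sketched with a few lines of Python. This is a hypothetical illustration of applying the law of cosines to three tracked joint positions (for example shoulder, elbow, and wrist), not code from the UW system; the `joint_angle` helper and the sample coordinates are invented for the example.

```python
import math

def joint_angle(a, b, c):
    """Return the angle (in degrees) at joint b, given three 3-D joint positions."""
    ab = math.dist(a, b)
    bc = math.dist(b, c)
    ac = math.dist(a, c)
    # Law of cosines: ac^2 = ab^2 + bc^2 - 2*ab*bc*cos(angle at b)
    cos_b = (ab ** 2 + bc ** 2 - ac ** 2) / (2 * ab * bc)
    # Clamp to [-1, 1] to guard against floating-point error before acos
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_b))))

# A straight arm: shoulder, elbow, and wrist in a line gives ~180 degrees
print(round(joint_angle((0, 0, 0), (1, 0, 0), (2, 0, 0))))  # 180
```

Comparing angles like this against the target ranges for a pose is what lets the system decide, joint by joint, which correction to speak next.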
The system can issue around 30 different commands, based on a dozen rules essential to each of the six poses it can teach, which include the Warrior I and II, Tree, and Chair poses. The instructions are in simple language, such as "Rotate your shoulders left" or "Lean sideways toward your left," and the system also provides positive feedback when the correct pose is achieved.
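A rule of this kind can be pictured as an acceptable angle range with a spoken correction on either side. The sketch below is purely illustrative; the thresholds and command strings are invented, not taken from the Eyes-Free Yoga rule set.

```python
def feedback(angle, low, high, too_small_cmd, too_large_cmd):
    """Map a measured joint angle onto a spoken instruction for one pose rule."""
    if angle < low:
        return too_small_cmd
    if angle > high:
        return too_large_cmd
    return "Good job!"  # positive feedback once this part of the pose is correct

# E.g. a hypothetical front-knee rule for a Warrior pose:
print(feedback(130, 80, 100, "Bend your front knee more", "Straighten your front knee"))
```

Checking a dozen such rules per pose, in the core-first order described above, yields the repertoire of short verbal corrections the system speaks.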
While practicing yoga herself as she developed the technology, Rector tested the program with 16 blind and low-vision people around Washington, who provided feedback. Although some of the test subjects had previously practiced yoga regularly, several had never done yoga before. Even so, 13 of the 16 said they would recommend the program, and nearly all would use it again.
Although Rector says the Kinect has some limitations in how finely it can track movement, she chose it for its open source software and wide market availability.
“I see this as a good way of helping people who may not know much about yoga to try something on their own and feel comfortable and confident doing it,” says Julie Kientz, a UW assistant professor in Human Centered Design & Engineering who collaborated with Rector on the system along with research assistant Cynthia Bennett. “We hope this acts as a gateway to encouraging people with visual impairments to try exercise on a broader scale,” Kientz adds. To that end, the team plans to make the program available for download online for use with a PC and Kinect.
The system's development and testing are detailed in a paper (PDF) published in the conference proceedings of the Association for Computing Machinery's SIGACCESS International Conference on Computers and Accessibility in Bellevue, Washington.
Eyes-Free Yoga is demonstrated in the following video.
Source: University of Washington