By combining a wirelessly connected EEG headset from Emotiv with an assistive communication app, California-based Smartstones is bringing the power of speech to those who have difficulty communicating verbally. The "think to speak" technology works by reading the brainwaves of the user and expressing them as phrases spoken aloud through the app.
:prose is the app at the heart of it all, developed by Smartstones to help nonverbal people communicate by tapping or swiping on a mobile device. Like sign language, individual gestures and movements are linked to words and phrases: for example, swiping up could mean "I want", and drawing a circular motion could mean "water". The app recognizes the input and speaks aloud the complete sentence, "I want water." The commands are customizable too, so a user can assign phrases to specific movements however they like.
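The gesture-to-phrase mapping described above can be pictured as a simple lookup table. The sketch below is purely illustrative and is not Smartstones' actual code; the gesture names and phrase fragments are assumptions chosen to mirror the article's example.

```python
# Illustrative sketch (not the :prose app's real implementation): each
# gesture token maps to a user-customizable phrase fragment, and a
# sequence of gestures is composed into a spoken sentence.

GESTURE_PHRASES = {
    "swipe_up": "I want",   # hypothetical gesture names
    "circle": "water",
}

def compose_sentence(gestures):
    """Join the phrase fragments for a sequence of gestures into a sentence."""
    fragments = [GESTURE_PHRASES[g] for g in gestures]
    return " ".join(fragments) + "."

print(compose_sentence(["swipe_up", "circle"]))  # I want water.
```

In the real app this table would be editable by the user, which is what makes the commands customizable.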
It's a valuable tool for people living with conditions that make verbal communication challenging, such as ALS, autism and cerebral palsy, or who have suffered brain or spinal injuries. But for some, the movements required to use the app can themselves be difficult to perform, due to conditions like Parkinson's or ALS that inhibit a person's motor skills. That's where the EEG headset comes in.
Using Emotiv's Epoc or Insight headset in conjunction with the :prose app, a user can simply think about the motions tied to each command, and the headset reads their brainwaves, transmits the signal to the app via Bluetooth, and speaks the related phrase aloud. It works the same way as gestures and touch interfaces, but removes the element of physical movement that some people may struggle with. Users can even send messages to other devices with the power of thought alone.
The system is reportedly much easier to learn than previous pictogram-based approaches, and relatively inexpensive. Combine this with other technologies such as EyeControl or the cheek-movement speech synthesizer used by Stephen Hawking, and the future could be a little brighter for those "locked in" their own bodies.
Smartstones worked with nonprofit organization PathPoint to test the technology, and the results are encouraging.
"One of our participants now can communicate by using mental commands," said Gil Trevino, Lead Direct Support Professional at PathPoint. "Within minutes she was speaking several phrases aloud, compared to years of training with other technologies. This advancement has allowed someone who once was a non-verbal communicator, the ability to communicate thoughts, feelings and answers in a way she never has before."
The :prose app is available for iOS, and Smartstones will shortly be running a beta of its "think to speak" system. Anyone interested in participating can sign up now via the source link below.
The following video shows the system in action.