Face-sticker sensor could allow ALS patients to communicate

The device (pictured here applied to a beaker) should cost no more than about $10

Amyotrophic lateral sclerosis (ALS) causes people to lose control of their muscles – unfortunately, that loss often eventually extends to the ability to speak. A new skin-worn device, however, could still let patients communicate with others.

Because ALS often also affects the sufferer's limbs, writing or typing messages isn't always an option.

Some people instead utilize setups that measure the electrical activity of the nerves that control their facial muscles. In this way, subtle cheek-twitches or other intentional facial movements can be used to relay simple preprogrammed messages.

Such systems, however, tend to be relatively cumbersome – according to researchers at MIT, they're also not highly accurate at identifying specific movements. Seeking a less obtrusive, more reliable alternative, the scientists created a flexible, stretchable, inexpensive device that is temporarily applied to the facial skin. It's not highly noticeable, and can reportedly be made almost invisible by applying a coating of makeup.

Called cFaCES (conformable Facial Code Extrapolation Sensor), the tool consists of four piezoelectric sensors made of aluminum nitride, which are embedded in a thin silicone film. As the user smiles, twitches their cheek or performs other facial movements, the accompanying deformation of their skin places pressure on one or more of the sensors.
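For a rough sense of the physics involved – not figures from the study – a piezoelectric element converts an applied force into electric charge, Q = d33 × F, which appears as a small voltage across the sensor. The sketch below uses a typical literature value for aluminum nitride's d33 coefficient, while the deformation force and sensor capacitance are purely assumed:

```python
# Back-of-the-envelope piezoelectric response: charge Q = d33 * F appears as
# voltage V = Q / C across a sensor of capacitance C. The d33 value is a
# typical literature figure for aluminum nitride; the force and capacitance
# are assumptions made for illustration only.
d33 = 5e-12         # C/N, approximate piezoelectric coefficient of AlN
force = 0.2         # N, assumed skin-deformation force on one sensor
capacitance = 1e-9  # F, assumed sensor capacitance

charge = d33 * force            # coulombs generated by the deformation
voltage = charge / capacitance  # volts seen by the readout electronics
print(f"Signal of roughly {voltage * 1e3:.1f} mV")
```

Signals on this millivolt scale would explain the need for the dedicated handheld processing unit described below.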

These respond by producing an electrical current, which is measured by an accompanying handheld processing unit. Different current strengths are associated with different movements, which are in turn used to convey different messages. These could include simple phrases like "I love you" or "I'm hungry." Alternatively, a series of movements could be combined with one another for more personalized, detailed messages.
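As a loose illustration of that decoding idea – not the MIT team's actual software, and with every channel value, template and phrase invented for the example – a reading from the four sensor channels could be matched to its nearest stored movement "template" and then looked up against preset phrases:

```python
import math

# Hypothetical per-movement templates: mean signal (mV) on each of the
# four piezoelectric channels during a given facial movement.
TEMPLATES = {
    "smile":       (12.0, 3.0, 11.5, 2.5),
    "pursed_lips": (4.0, 9.0, 4.5, 8.5),
    "open_mouth":  (15.0, 14.0, 15.5, 13.5),
}

# Preset messages keyed by recognized movement (illustrative only).
MESSAGES = {
    "smile": "I love you",
    "pursed_lips": "I'm hungry",
    "open_mouth": "Please call the nurse",
}

def classify(reading):
    """Return the template movement closest (Euclidean distance) to a reading."""
    return min(TEMPLATES, key=lambda t: math.dist(reading, TEMPLATES[t]))

def decode(readings):
    """Map a series of readings to movements; a single movement returns its
    preset phrase, while a sequence stands in for a combined message."""
    movements = [classify(r) for r in readings]
    if len(movements) == 1:
        return MESSAGES[movements[0]]
    return " + ".join(movements)  # placeholder for combined-message logic

print(decode([(11.8, 3.2, 11.1, 2.9)]))  # -> "I love you"
```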

In tests conducted on two ALS patients, the technology was shown to be about 75-percent accurate at distinguishing between three facial expressions: smiling, pursed lips, and open mouth. Both the accuracy and the number of identifiable expressions should increase as the system is developed further.
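To see how such an accuracy figure is typically computed, consider a confusion matrix of actual versus predicted expressions; the trial counts below are invented purely to show the arithmetic, and are not the study's data:

```python
import numpy as np

labels = ["smile", "pursed_lips", "open_mouth"]

# Hypothetical confusion matrix: rows = actual expression, columns = predicted.
confusion = np.array([
    [16, 2, 2],   # actual: smile
    [3, 14, 3],   # actual: pursed_lips
    [2, 3, 15],   # actual: open_mouth
])

# Overall accuracy = correct predictions (diagonal) / total trials.
accuracy = np.trace(confusion) / confusion.sum()
print(f"Overall accuracy: {accuracy:.0%}")  # 45 of 60 trials -> 75%
```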

It is estimated that once commercialized, the device could cost as little as US$10.

cFaCES is demonstrated in the following video. A paper on the research, which is being led by Asst. Prof. Canan Dagdeviren, was recently published in the journal Nature Biomedical Engineering.

Source: MIT

