Robotics

Flashing LEDs facilitate brain-controlled exoskeleton

Researchers have developed a brain-computer interface that allows quadriplegics to control a lower limb exoskeleton by looking at specific LEDs
Korea University/TU Berlin

Lower limb exoskeletons show great promise in helping those who have lost the use of their legs to walk again. However, if a person has been rendered quadriplegic, any hand controls in such a device are essentially useless. To help address this and other whole-of-body disabilities, scientists at Korea University (KU) and Technische Universität Berlin (TU Berlin) have created a hands-free brain-computer interface that controls a lower limb exoskeleton by decoding signals directly from the wearer's brain.

There are any number of exoskeleton units in development or in limited production today, and control of these is often achieved by detecting subtle upper body movements, as in the ReWalk. To operate the KU/TU Berlin unit, however, the user issues commands by staring at one of five flashing LEDs while an electroencephalogram (EEG) cap reads the signals in the wearer's brain corresponding to the desired mode of movement. As each LED flashes at a specific frequency, focusing on a particular one produces a distinctive signal in the user's brain. The system then interprets the readings from the EEG cap and converts them into instructions that operate the exoskeleton.
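This frequency-tagging scheme is known in brain-computer interface research as the steady-state visually evoked potential (SSVEP). As a rough illustration of the idea, the minimal sketch below simply picks the LED whose flicker frequency carries the most power in an EEG epoch. The sampling rate, flicker frequencies, and command mapping are all assumptions for the example, not values from the team's paper.

```python
# Illustrative SSVEP-style decoder sketch. FS, LED_FREQS and COMMANDS
# are assumed example values, not taken from the KU/TU Berlin system.
import numpy as np
from scipy.signal import welch

FS = 250                                  # EEG sampling rate in Hz (assumed)
LED_FREQS = [9, 11, 13, 15, 17]           # one flicker frequency per LED (assumed)
COMMANDS = ["forward", "left", "right", "stand", "sit"]  # assumed mapping

def decode_command(eeg_epoch):
    """Return the command for the LED whose flicker frequency shows the
    strongest power in an EEG epoch of shape (channels, samples)."""
    freqs, psd = welch(eeg_epoch, fs=FS, nperseg=FS * 2, axis=-1)
    psd = psd.mean(axis=0)                # average spectrum across channels
    scores = [psd[np.argmin(np.abs(freqs - f))] for f in LED_FREQS]
    return COMMANDS[int(np.argmax(scores))]

# Example: decode a 4-second epoch of 8-channel EEG
epoch = np.random.randn(8, 4 * FS)        # stand-in for a real recording
print(decode_command(epoch))
```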

This method of control makes the system suitable even for those with almost no capacity for voluntary body movement (apart from eye movements), who would be unable to control a standard exoskeleton. The researchers claim their system also offers a much better signal-to-noise ratio, separating the brain's control signals from the surrounding noise of ordinary brain activity for more accurate exoskeleton operation than more conventional hard-wired systems.

Staring at a specific LED generates different signals in the brain which are detected and interpreted to control the exoskeleton hands-free
Korea University/TU Berlin

"Exoskeletonscreate lots of electrical 'noise'," said Professor Klaus Muller, of the TUBerlin Machine Learning Group. "The EEG signal gets buried under all thisnoise – but our system is able to separate not only the EEG signal, but thefrequency of the flickering LED within this signal. People with amyotrophiclateral sclerosis (ALS) [motor neuron disease or Lou Gehrig's disease], or high spinal cord injuriesface difficulties communicating or using their limbs. Decoding what they intendfrom their brain signals could offer means to communicate and walk again."

According to the researchers, learning to operate the system was relatively simple, and volunteers required only a few minutes to get the hang of it. The only downsides they noted were that participants suffering from epilepsy had to be excluded due to their possible reaction to the flashing LEDs, and that operators suffered some degree of "visual fatigue" after long-term use. The researchers took these factors on board and have slated further research into reducing or negating their effects.

"We were drivento assist disabled people, and our study shows that this brain controlinterface can easily and intuitively control an exoskeleton system – despitethe highly challenging artefacts from the exoskeleton itself," said ProfessorMuller.

The team's paper is published in the Journal of Neural Engineering and the technology can be seen in action in the video below.

Source: Institute of Physics
