Lower limb exoskeletons show great promise in helping people who have lost the use of their legs to walk again. However, for a person who is quadriplegic, any hand controls on such a device are essentially useless. To help address this and other whole-of-body disabilities, scientists at Korea University (KU) and Technische Universität Berlin (TU Berlin) have created a hands-free brain-computer interface that controls a lower limb exoskeleton by decoding signals directly from the wearer's brain.
A number of exoskeleton units are in development or in limited production today, and control is often achieved by detecting subtle upper body movements, as in the ReWalk. To operate the KU/TU Berlin unit, however, the user issues commands by staring at one of five flashing LEDs while an electroencephalogram (EEG) cap reads the signals in the wearer's brain that correspond to the desired mode of movement. Because each LED flashes at a specific frequency, focusing on a particular one produces a matching frequency signature in the user's brain. The system interprets these signals via the EEG cap and converts them into instructions to operate the exoskeleton.
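The flicker-frequency decoding described above can be sketched in a few lines: measure how strongly each candidate LED frequency is present in a window of EEG samples and map the strongest one to a command. This is a toy, single-channel illustration, not the researchers' actual pipeline; the frequencies, the command mapping, and the function names are all invented for the example.

```python
import math
import random

# Hypothetical LED flicker frequencies (Hz) and their commands --
# illustrative values only, not taken from the published paper.
LED_COMMANDS = {
    9.0: "walk forward",
    11.0: "turn left",
    13.0: "turn right",
    15.0: "sit",
    17.0: "stand",
}

def power_at(signal, fs, freq):
    """Power of one frequency component via a direct DFT projection."""
    re = sum(s * math.cos(2 * math.pi * freq * n / fs) for n, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * freq * n / fs) for n, s in enumerate(signal))
    return re * re + im * im

def decode_command(signal, fs):
    """Pick the LED frequency with the strongest response in the EEG window."""
    best = max(LED_COMMANDS, key=lambda f: power_at(signal, fs, f))
    return LED_COMMANDS[best]

# Demo: synthesize a noisy one-second "EEG" window dominated by the 11 Hz LED.
random.seed(0)
fs = 256  # samples per second
signal = [math.sin(2 * math.pi * 11.0 * n / fs) + random.gauss(0, 0.5)
          for n in range(fs)]
print(decode_command(signal, fs))  # -> turn left
```

A real system would work with multiple EEG channels and use far more robust spectral estimation to pull the flicker response out of the noise, but the core idea, frequency detection mapped to discrete commands, is the same.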
This method of control makes the system suitable even for those with almost no capacity for voluntary body control (apart from eye movements), who would not be able to operate a standard exoskeleton. The researchers claim their system also offers a much better signal-to-noise ratio, separating the control signals from the surrounding noise of ordinary brain activity for more accurate exoskeleton operation than conventional hard-wired systems.
"Exoskeletons create lots of electrical 'noise'," said Professor Klaus-Robert Müller of the TU Berlin Machine Learning Group. "The EEG signal gets buried under all this noise – but our system is able to separate not only the EEG signal, but the frequency of the flickering LED within this signal. People with amyotrophic lateral sclerosis (ALS) [motor neuron disease or Lou Gehrig's disease], or high spinal cord injuries face difficulties communicating or using their limbs. Decoding what they intend from their brain signals could offer means to communicate and walk again."
According to the researchers, learning to operate the system was relatively simple, and volunteers required only a few minutes to get the hang of it. The only downsides noted were that participants with epilepsy were excluded due to their possible reaction to the flashing LEDs, and that operators suffered some degree of "visual fatigue" after extended use. The researchers have slated both issues for further study aimed at reducing or eliminating their effects.
"We were driven to assist disabled people, and our study shows that this brain control interface can easily and intuitively control an exoskeleton system – despite the highly challenging artefacts from the exoskeleton itself," said Professor Müller.
The team's paper is published in the Journal of Neural Engineering and the technology can be seen in action in the video below.
Source: Institute of Physics