
A more user-friendly brain-machine interface

Scientists are creating a brain-computer interface that will allow users to control devices, without having to continuously concentrate on doing so (Image: EPFL)

Practical thought-controlled devices, such as wheelchairs, artificial arms, or even cars, are perhaps a step closer to reality thanks to research being carried out at Switzerland's Ecole Polytechnique Fédérale de Lausanne (EPFL). Traditionally, brain-computer interfaces require the user to constantly maintain one of a small set of mental commands: turn left, turn right, or no-command (go straight). According to EPFL, most users can't sustain the necessary mental effort for more than about an hour. The school is developing a new system, however, that allows users to take breaks and shift their attention to other things while their thought-controlled device continues to operate on its own.

Like other brain-computer interfaces, EPFL's system uses EEG readings obtained from a network of sensors on the user's scalp. What makes it different is that it applies statistical analysis – or probability theory – to those readings, allowing it to learn what the user expects of it. When combined with a Shared Control system, which uses cameras and sensors to augment the thought-control commands, it can avoid obstacles or keep travelling in a straight line without constant mental updates from the user.
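To make that idea concrete, here is a minimal sketch, not EPFL's actual software, of how a probabilistic decoder might score the three mental commands and how a shared-control layer could fall back to autonomous driving when the decoder is unsure or a sensor detects an obstacle. The class names, the simple Gaussian model, and the confidence threshold are all illustrative assumptions.

```python
# Illustrative sketch of probabilistic command decoding plus shared control.
# All statistics and thresholds are made up for demonstration purposes.
import numpy as np

COMMANDS = ["left", "right", "none"]

class ProbabilisticDecoder:
    """Assigns a posterior probability to each mental command from an EEG feature vector."""
    def __init__(self, class_means, class_cov):
        self.means = class_means            # one mean feature vector per command
        self.cov_inv = np.linalg.inv(class_cov)

    def posteriors(self, features):
        # Gaussian likelihoods with a flat prior, normalised to sum to 1.
        scores = np.array([
            np.exp(-0.5 * (features - m) @ self.cov_inv @ (features - m))
            for m in self.means
        ])
        return scores / scores.sum()

def shared_control(posteriors, obstacle_left, obstacle_right, threshold=0.7):
    """Act on a mental command only when the decoder is confident,
    and let the sensors veto turns that would hit an obstacle."""
    best = int(np.argmax(posteriors))
    if posteriors[best] < threshold or COMMANDS[best] == "none":
        return "keep current heading"       # device keeps driving on its own
    if COMMANDS[best] == "left" and obstacle_left:
        return "keep current heading"       # sensor veto
    if COMMANDS[best] == "right" and obstacle_right:
        return "keep current heading"
    return f"turn {COMMANDS[best]}"

# Example with two-dimensional EEG features and fabricated class statistics.
decoder = ProbabilisticDecoder(
    class_means=[np.array([1.0, 0.0]), np.array([-1.0, 0.0]), np.array([0.0, 0.0])],
    class_cov=np.eye(2) * 0.5,
)
p = decoder.posteriors(np.array([1.0, 0.0]))
print(shared_control(p, obstacle_left=False, obstacle_right=True))   # -> "turn left"
```

The key design point this is meant to illustrate is that the device never depends on a continuous stream of commands: whenever the posterior is flat or ambiguous, control reverts to the sensor-driven behaviour, which is what lets the user relax between instructions.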

The system was recently demonstrated at the AAAS (American Association for the Advancement of Science) 2011 Annual Meeting in Washington, D.C. Volunteers were hooked up to it, then asked to read silently, speak, or read aloud, while simultaneously issuing as many left, right, or no-commands as possible. The system was reportedly able to filter out the extraneous mental "noise" and pick up the commands intended for it. Theoretically, this means that a wheelchair user could use the system to make their way through a building while also carrying on a phone conversation, for example.
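One plausible way to achieve that kind of noise rejection, sketched below purely as an illustration rather than as the demonstrated system, is to deliver a command to the device only when the decoder stays confident about it over several consecutive EEG windows, so that isolated flickers of probability caused by reading or speaking are discarded.

```python
# Illustrative sketch: accept a command only after several consecutive
# confident windows agree on it. Thresholds and window counts are assumptions.
from collections import deque

def command_filter(posterior_stream, threshold=0.8, windows_required=3):
    """Yield a command only after it wins `windows_required` windows in a row."""
    recent = deque(maxlen=windows_required)
    for posteriors in posterior_stream:          # dict: command -> probability
        best, prob = max(posteriors.items(), key=lambda kv: kv[1])
        recent.append(best if prob >= threshold else None)
        if len(recent) == windows_required and len(set(recent)) == 1 and recent[0]:
            yield recent[0]
            recent.clear()

# Example: a sustained "left" command amid reading- or speech-related activity.
stream = [
    {"left": 0.55, "right": 0.25, "none": 0.20},   # ambiguous -> treated as noise
    {"left": 0.90, "right": 0.05, "none": 0.05},
    {"left": 0.88, "right": 0.07, "none": 0.05},
    {"left": 0.91, "right": 0.04, "none": 0.05},   # third confident window -> emit
]
print(list(command_filter(stream)))                # -> ['left']
```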

While the system won't be in production anytime soon, it has already moved into clinical trials. The researchers are currently working on further integrating it with Shared Control systems, and improving its interpretation of cognitive information.

Multitasking with BCI Machines
