Researchers at the CNRS-AIST Joint Robotics Laboratory (a collaboration between France's Centre National de la Recherche Scientifique and Japan's National Institute of Advanced Industrial Science and Technology) are developing software that allows a person to drive a robot with their thoughts alone. The technology could one day give a paralyzed patient greater autonomy through a robotic agent or avatar.
The system requires that a patient concentrate their attention on a symbol displayed on a computer screen (such as a flashing arrow). An electroencephalography (EEG) cap outfitted with electrodes reads the electrical activity in their brain, which is interpreted by a signal processor. Finally, the desired command is sent to the robot to carry out.
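The pipeline above can be sketched in a few lines. This is purely illustrative, assuming a separate EEG classifier (not shown) that outputs the label of the symbol the user is attending to; the symbol names, command names, and mapping are hypothetical, not from the CNRS-AIST system.

```python
# Hypothetical mapping from a classified on-screen symbol to a preset
# robot command. The hard part -- classifying raw EEG into one of these
# labels -- is assumed to happen upstream in the signal processor.
PRESET_COMMANDS = {
    "arrow_forward": "walk_forward",
    "arrow_left": "turn_left",
    "arrow_right": "turn_right",
    "arrow_stop": "halt",
}

def decode_command(classified_symbol: str) -> str:
    """Translate the symbol the EEG classifier detected into a robot command."""
    try:
        return PRESET_COMMANDS[classified_symbol]
    except KeyError:
        # Unrecognized or low-confidence output: halt rather than
        # send the robot a spurious command.
        return "halt"

print(decode_command("arrow_left"))  # turn_left
```

The key design point is that the brain signal only selects among a small set of discrete, preset actions; the robot's own software handles everything else.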
The system does not provide direct fine-grained motor control: the robot is simply performing a preset action such as walking forward, turning right or left, and so on. The robot's artificial intelligence, developed over several years at the lab, allows it to perform more delicate tasks such as picking up an object from a table without needing human input. In this scenario, the robot's camera images are parsed by object recognition software, allowing the patient to choose one of the objects on a table by focusing their attention on it.
Object recognition software automatically detects and highlights the bottled water and canned drink in the robot's camera images, and by focusing on one of them the patient can command the robot to retrieve it
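The selection step described above can be sketched as follows. This is a simplified illustration, not the lab's actual code: it assumes the vision system reports each detected object with a bounding box, and that the interface resolves the user's attention to a point in the camera image; the object whose box contains that point becomes the retrieval target.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """One object found by the (assumed) object recognition software."""
    label: str
    x: int   # bounding box top-left corner, pixels
    y: int
    w: int   # bounding box width and height, pixels
    h: int

def select_target(detections, focus_x, focus_y):
    """Return the detected object whose bounding box contains the focus point,
    or None if the user's attention falls on empty space."""
    for d in detections:
        if d.x <= focus_x <= d.x + d.w and d.y <= focus_y <= d.y + d.h:
            return d
    return None

# Example scene matching the image caption: a bottle and a can on a table.
scene = [
    Detection("bottled_water", 100, 80, 60, 160),
    Detection("canned_drink", 220, 120, 50, 90),
]
target = select_target(scene, 250, 150)  # attention falls on the can
print(target.label)  # canned_drink
```

Once a target is chosen, grasp planning is left entirely to the robot, which is what lets a coarse, low-bandwidth brain signal drive a delicate manipulation task.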
With training, the user can direct the robot's movements and pick up beverages or other objects in their surroundings. The system can be seen in use in the DigInfo video at the bottom of the page.
A different but more direct approach would be to track a patient's eye movements. Recent research conducted at the Université Pierre et Marie Curie-Paris enabled cursive writing on a computer screen through eye movement alone. The same technology could allow a patient to move a cursor and select from a multitude of action icons without having to go through the EEG middleman. The hitch is that, in some circumstances, eye movement isn't possible or can't be tracked reliably due to eye conditions. In that case, brain implants may be the way to go.
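A common way gaze interfaces implement icon selection is dwell time: an action icon triggers once the tracked gaze has rested on it long enough. A minimal sketch, with illustrative names and thresholds (the article does not describe the Paris group's method):

```python
DWELL_THRESHOLD_S = 1.0  # how long gaze must rest on one icon to select it

def select_by_dwell(gaze_samples, sample_period_s=0.05):
    """gaze_samples: sequence of icon names (or None for empty screen space),
    one per eye-tracker sample. Returns the first icon the gaze dwells on
    for at least DWELL_THRESHOLD_S, or None if no selection occurs."""
    current, dwell = None, 0.0
    for icon in gaze_samples:
        if icon is not None and icon == current:
            dwell += sample_period_s
            if dwell >= DWELL_THRESHOLD_S:
                return icon
        else:
            # Gaze moved to a different icon (or off all icons): restart timer.
            current, dwell = icon, 0.0
    return None

# A steady one-second gaze selects the icon; a glance does not.
print(select_by_dwell(["pick_up"] * 21))  # pick_up
print(select_by_dwell(["pick_up"] * 5))   # None
```

The dwell threshold trades speed against accidental selections, the same robustness concern the article raises about unreliable eye tracking.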
No matter how you slice it, researchers aren't giving up, and with further progress robot avatars may cease being the stuff of science fiction. No doubt patients would feel empowered and liberated by this technology, but it will be a while before it can be implemented, and the robots being deployed will likely look more like Toyota's recently unveiled Human Support Robot than advanced bipedal robots.