
Mind-controlled robot avatars inch towards reality

[Image gallery: 5 images]
1. The robot's field of view can be directed by concentrating on one of the blinking circles overlaid on its camera image
2. Object recognition software automatically detects and highlights the bottled water and canned drink in the robot's camera images, and by focusing on one of them the patient can command the robot to retrieve it
3. A researcher minds the robot's balance as it is commanded to pick up a canned drink by an operator (off camera)
4. An operator can move the robot in different directions by concentrating on the flashing icons
5. Electrical signals picked up by the EEG cap are interpreted by a signal processor as specific intentions, which are then sent as commands to the robot

Researchers at the CNRS-AIST Joint Robotics Laboratory (a collaboration between France's Centre National de la Recherche Scientifique and Japan's National Institute of Advanced Industrial Science and Technology) are developing software that allows a person to drive a robot with their thoughts alone. The technology could one day give a paralyzed patient greater autonomy through a robotic agent or avatar.

The system requires that a patient concentrate their attention on a symbol displayed on a computer screen (such as a flashing arrow). An electroencephalography (EEG) cap outfitted with electrodes reads the electrical activity in their brain, which is interpreted by a signal processor. Finally, the desired command is sent to the robot to carry out.
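
The article doesn't detail the signal-processing step, but symbols flashing at distinct rates are the hallmark of an SSVEP-style brain-computer interface, where attending to a flicker boosts that frequency in the EEG. The sketch below illustrates the general idea; the sampling rate, flicker frequencies, command names, and synthetic signal are all assumptions made for this example, not the lab's actual implementation.

```python
# Minimal SSVEP-style intent classifier (illustrative only).
# Each on-screen symbol flashes at its own frequency; attending to one
# strengthens that frequency in the EEG, which we detect via the FFT.
import numpy as np

FS = 256                      # sampling rate in Hz (assumed)
COMMANDS = {8.0: "walk_forward", 10.0: "turn_left",
            12.0: "turn_right", 15.0: "stop"}   # flicker Hz -> action

def band_power(signal: np.ndarray, freq: float, fs: int = FS) -> float:
    """Power of `signal` at `freq`, estimated from the FFT magnitude."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    idx = np.argmin(np.abs(freqs - freq))   # nearest FFT bin
    return float(spectrum[idx])

def classify_intent(eeg_window: np.ndarray) -> str:
    """Return the command whose flicker frequency dominates the window."""
    powers = {f: band_power(eeg_window, f) for f in COMMANDS}
    return COMMANDS[max(powers, key=powers.get)]

if __name__ == "__main__":
    # Synthetic 2-second window: noise plus a 10 Hz component, as if the
    # operator were attending to the "turn_left" symbol.
    t = np.arange(2 * FS) / FS
    eeg = 0.5 * np.sin(2 * np.pi * 10.0 * t) + np.random.randn(t.size) * 0.3
    print(classify_intent(eeg))   # expected: turn_left
```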

The system does not provide direct fine-grained motor control: the robot simply performs a preset action such as walking forward, turning left or right, and so on. The robot's artificial intelligence, developed over several years at the lab, allows it to perform more delicate tasks, such as picking up an object from a table, without needing human input. In this scenario, the robot's camera images are parsed by object recognition software, allowing the patient to choose one of the objects on a table by focusing their attention on it.
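
One plausible way to structure the hand-off between the interface and the robot's own intelligence is a small dispatcher: the BCI emits only coarse intents, and the robot maps each one to a preset routine, including an autonomous grasp of a recognized object. Everything below (the Robot class, intent strings, and object fields) is a hypothetical illustration, not the lab's code.

```python
# Sketch: coarse BCI intents dispatched to preset robot behaviors.
from dataclasses import dataclass

@dataclass
class DetectedObject:
    label: str          # e.g. "canned_drink", from object recognition
    x: float            # position in the robot's frame (assumed)
    y: float

class Robot:
    def walk_forward(self):
        print("executing preset: walk forward")
    def turn(self, direction: str):
        print(f"executing preset: turn {direction}")
    def grasp(self, obj: DetectedObject):
        # Fine motor control lives here, not in the BCI: the robot plans
        # its own reach-and-grasp toward the recognized object.
        print(f"planning grasp of {obj.label} at ({obj.x}, {obj.y})")

def dispatch(robot: Robot, intent: str, objects: list):
    if intent == "walk_forward":
        robot.walk_forward()
    elif intent in ("turn_left", "turn_right"):
        robot.turn(intent.split("_")[1])
    elif intent.startswith("select:"):
        # Intent produced by attending to a highlighted object overlay.
        label = intent.split(":", 1)[1]
        target = next((o for o in objects if o.label == label), None)
        if target is not None:
            robot.grasp(target)

scene = [DetectedObject("bottled_water", 0.4, 0.1),
         DetectedObject("canned_drink", 0.5, -0.2)]
dispatch(Robot(), "select:canned_drink", scene)
```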

Object recognition software automatically detects and highlights the bottled water and canned drink in the robot's camera images, and by focusing on one of them the patient can command the robot to retrieve it

With training, the user can direct the robot's movements and pick up beverages or other objects in their surroundings. The system can be seen in use in the DigInfo video at the bottom of the page.

The system is similar to, but more sophisticated than, earlier projects: one involving Honda's ASIMO robot in 2006, and another at the University of Washington in 2007.

A different but more direct approach would be to track a patient's eye movements. Recent research conducted at the Université Pierre et Marie Curie-Paris enabled cursive writing on a computer screen through eye movement alone. The same technology could allow a patient to move a cursor and select from a multitude of action icons without having to go through the EEG middleman. The hitch is that, in some circumstances, eye movement isn't possible or can't be tracked reliably due to eye conditions. In those cases, brain implants may be the way to go.
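
For the eye-tracking alternative, selection is commonly done by dwell time: if the gaze rests on an icon long enough, the icon counts as clicked. Here is an illustrative sketch under that assumption; the icon layout, sampling rate, and gaze stream are invented for the example.

```python
# Sketch: dwell-time icon selection from a stream of gaze samples.
from dataclasses import dataclass

DWELL_SECONDS = 1.0   # how long the gaze must rest on an icon (assumed)

@dataclass
class Icon:
    name: str
    x: float
    y: float
    w: float
    h: float
    def contains(self, gx: float, gy: float) -> bool:
        return (self.x <= gx <= self.x + self.w
                and self.y <= gy <= self.y + self.h)

def select_by_dwell(gaze_samples, icons, fs: int = 60):
    """gaze_samples: iterable of (x, y) pairs at `fs` Hz. Returns the
    first icon the gaze rests on for DWELL_SECONDS, or None."""
    needed = int(DWELL_SECONDS * fs)
    current, count = None, 0
    for gx, gy in gaze_samples:
        hit = next((i for i in icons if i.contains(gx, gy)), None)
        if hit is current and hit is not None:
            count += 1
            if count >= needed:
                return hit
        else:
            # Gaze moved to a different icon (or off all icons): reset.
            current, count = hit, (1 if hit else 0)
    return None

icons = [Icon("pick_up_drink", 0, 0, 100, 100),
         Icon("stop", 200, 0, 100, 100)]
gaze = [(50, 50)] * 70    # ~1.2 s of gaze inside the first icon at 60 Hz
chosen = select_by_dwell(gaze, icons)
print(chosen.name if chosen else "no selection")   # expected: pick_up_drink
```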

No matter how you slice it, researchers aren't giving up, and with further progress robot avatars may cease to be the stuff of science fiction. Patients would no doubt feel empowered and liberated by this technology, but it will be a while before it can be implemented, and the robots deployed will likely look more like Toyota's recently unveiled Human Support Robot than advanced bipedal robots.

Source: AIST-CNRS JRL (Japanese) via DigInfo News

Mind controlled android robot - Researchers working towards robotic re-embodiment #DigInfo
