April 2, 2009

Honda has taken some very significant steps toward what could be an absolute revolution in human-computer interfaces. Honda Research Institute Japan has demonstrated a Brain-Machine Interface (BMI) that enables a user to control an ASIMO robot using nothing more than thought. Wearing a headset containing both electroencephalography (EEG) and near-infrared spectroscopy (NIRS) sensors, the user simply imagines moving their right hand, left hand, tongue or feet - and ASIMO makes a corresponding movement. The system is still bulky and slow, and the commands are crude and imprecise - but Honda's baby steps represent a huge leap in technology. The next step is to refine the system to handle fine motor controls, add the ability to decode non-motor brain signals, and speed it all up. Then the door will be open for a whole range of machines that can sense your thoughts, intentions and feelings, and act directly upon them. BMI has staggering potential - this is just the beginning.
The keyboard, mouse and gamepad aren't going anywhere any time soon - but Honda's brain-reading BMI technology, still in its infancy, could open up some very exciting options in the future. We've already seen toys like the Force Trainer hit the market, sensing relaxation patterns in the brain and rewarding them - now Honda is showing how the idea can be extended to incorporate more precise controls.
Honda's original experiments in 2006, which used functional MRI scanning of the brain, faced a major obstacle: the sheer size of the scanner and the strong magnetic fields it generates severely limited the locations and applications in which the technology could be used.
The more recent BMI unit uses a combination of EEG (which measures changes in electrical potential on the scalp) and NIRS (which measures cerebral blood flow) to develop a picture of the brain's activity in real time. The user puts the headset on, and imagines moving either their left hand, right hand, tongue or feet - without actually moving that body part.
The user's EEG and NIRS readings are processed and statistically analyzed by a computer, with tests showing impressive accuracy - the signals were correctly interpreted more than 90% of the time. Finally, in a bit of showmanship, Honda's famous ASIMO robot goes on to move the body part the user was imagining.
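The recognition stage described above amounts to a four-way classification problem: map each trial's combined EEG/NIRS readings to one of the imagined movements. Honda has not published its algorithm, so the sketch below is purely illustrative - it generates synthetic feature vectors for the four classes and classifies them with a simple nearest-centroid rule. Every feature dimension, noise level and function name here is an assumption for demonstration only.

```python
import numpy as np

# The four imagined movements the system distinguishes between.
CLASSES = ["left hand", "right hand", "tongue", "feet"]
N_FEATURES = 8  # hypothetical: combined EEG band-power + NIRS blood-flow features

rng = np.random.default_rng(0)

# Synthetic stand-in for real brain data: one feature-space centroid per class.
centroids = rng.normal(size=(len(CLASSES), N_FEATURES))

def make_trials(n_per_class, noise=0.3):
    """Generate noisy synthetic trials clustered around each class centroid."""
    X, y = [], []
    for label, c in enumerate(centroids):
        X.append(c + noise * rng.normal(size=(n_per_class, N_FEATURES)))
        y.append(np.full(n_per_class, label))
    return np.vstack(X), np.concatenate(y)

def fit_centroids(X, y):
    """'Train' by averaging the trials observed for each class."""
    return np.stack([X[y == k].mean(axis=0) for k in range(len(CLASSES))])

def predict(model, X):
    """Assign each trial to the class whose centroid is nearest."""
    d = np.linalg.norm(X[:, None, :] - model[None, :, :], axis=2)
    return d.argmin(axis=1)

X_train, y_train = make_trials(50)
X_test, y_test = make_trials(20)
model = fit_centroids(X_train, y_train)
acc = (predict(model, X_test) == y_test).mean()
print(f"classification accuracy: {acc:.0%}")
```

Real BMI pipelines use far more sophisticated statistical analysis than this, but the shape of the problem is the same: turn a noisy multi-sensor reading into one of a small set of discrete commands.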
Honda is currently viewing the BMI technology as an interface for its robotics and artificial intelligence products - mind-controlled household assistant robots and the like. But once this non-invasive, safe technology becomes more refined and precise, there's no reason why it shouldn't be applied much more widely - and then, imagination is the only limit... movies and video games that can sense when you're scared, or offended, or bored, and adjust their content to keep you in the right zone. Climate control systems that can sense when you think it's too hot, and turn on the aircon or have a drink brought to you.
And of course, what we're all really waiting for: magnificent sex robots that can read and recreate your every fantasy, or somehow let you make mind-controlled whoopee with yourself. Actually, that one could cause some problems; no one would get any work done at the RIAA.
See the video below to watch the current state of Honda's Brain-Machine Interface technology, as demonstrated by the ever-friendly ASIMO.
Loz Blain