As machines grow more sophisticated, the mental capacity of their human overlords stays at a static (albeit seemingly impressive) level, and slowly starts to pale in comparison. The bandwidth of the human brain is not limitless, and if an overloaded brain happens to be overseeing machines carrying out potentially dangerous tasks, you can expect trouble. But why did we build the machines in the first place, if not to save us from trouble? Brainput, a brain-computer interface built by researchers from MIT and Tufts University, lets your computer know whether you’re mentally fit for the job at hand. If it decides your brain is overloaded with tasks, it will help you out by handling some of them for you.
Brainput uses a neuroimaging technology called functional near-infrared spectroscopy (fNIRS). This noninvasive, wearable brain-computer interface monitors blood flow (blood oxygenation and volume) in the front part of the wearer’s brain by measuring changes in near-infrared light. Essentially, this allows Brainput to infer whether the wearer is multitasking. The data on the level of brain activation is constantly fed to a computer, which automatically decreases the workload whenever mental capacity runs short.
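The loop described above — a stream of activation readings feeding a computer that flags overload — can be sketched in a few lines. This is purely illustrative, not the actual Brainput code: the window size and threshold are invented values, and real fNIRS classification is far more involved.

```python
from collections import deque

def make_overload_detector(window=5, threshold=0.8):
    """Return a function that flags overload from a stream of
    normalized fNIRS activation samples (hypothetical values)."""
    readings = deque(maxlen=window)

    def update(activation):
        # Keep a sliding window of recent samples; report overload
        # once the window is full and its average crosses the threshold.
        readings.append(activation)
        return len(readings) == window and sum(readings) / window > threshold

    return update

detector = make_overload_detector()
for sample in [0.4, 0.5, 0.9, 0.95, 0.92, 0.97, 0.99]:
    overloaded = detector(sample)  # flips to True as sustained activation rises
```

A real system would also have to smooth out sensor noise and calibrate per wearer; the sliding-window average here just stands in for that inference step.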
So far, a group of researchers led by MIT’s Erin Treacy Solovey has tested Brainput by using it to tell virtual robots when to seamlessly adapt to their human controllers’ current cognitive state. Each of the controllers, wearing head-mounted fNIRS equipment, had to guide two different robots through a maze towards a spot where the WiFi signal was strong enough to send a message. Left unattended, the robots would just crash into walls, so the controllers had to constantly switch their attention between one robot and the other, all the while keeping track of both their locations and routes.
While the humans struggled to keep their virtual robotic pupils from crashing into walls and attempted to lead them towards a target, the brain activity recorded by Brainput was transferred to the computer and used to instruct the robots. One state of mind Brainput was able to distinguish is called branching, and it can be observed when the subject is working simultaneously on two tasks that each require a heightened amount of attention. Whenever the controller was caught branching (not to be confused with “brunching”), the robots would react by taking matters into their own robotic hands and switching to partially autonomous navigation.
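The hand-off described above amounts to a simple rule: when the detected cognitive state is branching, each robot takes over its own navigation. The class and method names below are invented for illustration; the study’s actual software is not public here.

```python
class Robot:
    """Toy model of one of the maze robots (hypothetical API)."""

    def __init__(self, name):
        self.name = name
        self.autonomous = False  # starts under full manual control

    def on_cognitive_state(self, state):
        # Switch to partially autonomous navigation while the operator
        # is branching (juggling both tasks); hand control back otherwise.
        self.autonomous = (state == "branching")

robots = [Robot("robot-a"), Robot("robot-b")]
for robot in robots:
    robot.on_cognitive_state("branching")  # both robots go autonomous
```

The key design point the study highlights is that this switch is driven by the brain signal itself, not by any explicit command from the operator.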
As it turns out, the seamless switching in and out of autonomous mode contributed to an overall increase in the effectiveness of the human-robot cooperation. This is at least partly because the busy, multitasking humans would normally never notice that the robots had actually done part of the job by themselves.
Of course there are other ways for a machine to establish whether the human controlling it is overworked or has trouble focusing, such as measuring typing speed or keeping track of how often typos occur. However, by measuring the brain activity directly, Brainput goes a step further, opening the way to some pretty interesting potential uses.
First off, Brainput could spell hell for disinterested university students, as their lecturer would be instantly notified if they suddenly started “multitasking” by secretly texting under their desks. On the other hand, should a whole group of students start sending “overload” signals at any point, this would provide invaluable feedback to the teacher (about their own delivery, or their students’ abilities).
But that’s just one potential field where Brainput could be used. It could also be harnessed in a variety of ways to help us make up for the nonverbal cues that bring so much into face-to-face encounters and that are so difficult to recreate in the digital world. Brain activity signals instead of emoticons – is that the future?
The researchers themselves point to the potential to help pilots, drivers and UAV supervisors in a way that would make the human-computer tandem work in unison. Treacy Solovey also says that the team is going to look into other cognitive states that could potentially be measured using the fNIRS technology.