As the name suggests, video games are a very visual medium. But that hasn't stopped participants in a study at the University of Washington (UW) from successfully playing through a game without ever actually looking at it, hearing it or using any of the standard five senses. Instead, they were guided through virtual mazes via direct brain stimulation, in a demonstration of technology that could one day form the basis of sensory prosthetics to help visually impaired people navigate the real world, or provide a new way for anyone to interact with virtual ones.
The five players taking part in the UW study interacted with the game through a process known as transcranial magnetic stimulation (TMS), in which a magnetic coil placed against the back of the skull safely and painlessly stimulates specific parts of the brain. The technique has shown potential to treat migraines, aid learning, improve memory and allow direct brain-to-brain communication.
In this case, the players' brains were stimulated to guide them through 21 simple mazes in the game. Each step of the way offered a binary choice: walk forward or climb down a ladder. When an obstacle appeared in front of them, the device would stimulate the brain and create a visual artefact called a phosphene – which essentially means they saw a light when there was none. If the player sensed a phosphene, they needed to move downwards, while an absence indicated that the path ahead was clear.
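The decision rule described above is simple enough to sketch in code. The snippet below is purely illustrative, assuming nothing about the study's actual software: it maps each stimulation cue (phosphene or no phosphene) to one of the game's two possible moves.

```python
# Illustrative sketch of the study's binary navigation rule: a phosphene
# (a flash of light produced by brain stimulation) signals an obstacle
# ahead, so the player climbs down; no phosphene means the path is clear.
# All names here are hypothetical, not taken from the researchers' code.

def choose_move(phosphene_seen: bool) -> str:
    """Map the stimulation cue to one of the two possible moves."""
    return "climb down" if phosphene_seen else "walk forward"

# A toy maze encoded as the sequence of cues a player would receive.
maze_cues = [False, True, False, False, True]
moves = [choose_move(cue) for cue in maze_cues]
print(moves)
```

A player who responds to every cue correctly would walk forward on each clear step and descend the ladder at each obstacle, which is what the study measured as a "correct move."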
"The way virtual reality is done these days is through displays, headsets and goggles, but ultimately your brain is what creates your reality," says Rajesh Rao, senior author of the study. "The fundamental question we wanted to answer was: can the brain make use of artificial information that it's never seen before that is delivered directly to the brain to navigate a virtual world or do useful tasks without other sensory input? And the answer is yes."
Using this method, the players managed to make the correct moves 92 percent of the time, compared to a success rate of just 15 percent without it. Their abilities improved over time too, indicating that the subjects were able to make more sense of the unfamiliar input.
"We're essentially trying to give humans a sixth sense," says lead author Darby Losey. "So much effort in this field of neural engineering has focused on decoding information from the brain. We're interested in how you can encode information into the brain."
While the input came from a simple game in this experiment, the researchers say this artificial sixth sense could eventually be hooked up to a wide range of other sensors, such as cameras or rangefinders, to alert visually impaired people to obstacles in their path.
"The technology is not there yet — the tool we use to stimulate the brain is a bulky piece of equipment that you wouldn't carry around with you," says co-author Andrea Stocco. "But eventually we might be able to replace the hardware with something that's amenable to real world applications."
The next step for the team is to experiment with creating more complex cues, either visually or through other senses, by varying the intensity and location of the brain stimulation. Eventual applications may even extend to adding other senses to virtual and augmented reality.
"We look at this as a very small step toward the grander vision of providing rich sensory input to the brain directly and non-invasively," says Rao. "Over the long term, this could have profound implications for assisting people with sensory deficits while also paving the way for more realistic virtual reality experiences."
The research was published in the journal Frontiers in Robotics and AI.
Source: University of Washington