
Mind reading – scientists translate brain signals into words

Larger, numbered button-like electrodes (ECoGs) alongside the microECoGs indicated by the 4x4 circle grid at the end of the green and orange wires on the brain of a volunteer patient (Image: University of Utah Department of Neurosurgery)

An epileptic patient's brain is superimposed with the locations of two kinds of electrodes: conventional ECoG electrodes (yellow), and two grids (red) of 16 experimental microECoG electrodes used to read speech signals from the brain (Image: Kai Miller, University of Washington)

Using the same technology that allowed them to accurately detect the brain signals controlling arm movements that we looked at last year, researchers at the University of Utah have gone one step further, translating brain signals into words. While the previous breakthrough was an important step towards giving amputees or people with severe paralysis a high level of control over a prosthetic limb or computer interface, this new development marks an early step toward letting severely paralyzed people speak with their thoughts.

Nonpenetrating microECoGs

For their study, the research team placed grids of tiny microelectrodes over speech centers of the brain of a volunteer with severe epileptic seizures. These nonpenetrating microelectrodes, called microECoGs, are implanted beneath the skull but sit on top of the brain without poking into it. The volunteer already had a craniotomy – temporary partial skull removal – so doctors could place larger, conventional electrodes to locate the source of his seizures and surgically stop them.

An array of 16 microelectrodes – known as a microECoG grid – is arranged in a four-by-four array (Image: Spencer Kellis, The University of Utah)

Because the microelectrodes do not penetrate brain matter, they are considered safe to place on speech areas of the brain – something that cannot be done with penetrating electrodes that have been used in experimental devices to help paralyzed people control a computer cursor or an artificial arm. Additionally, EEG electrodes used on the skull to record brain waves are too big and record too many brain signals to be used easily for decoding speech signals from paralyzed people.

Each of the two grids of 16 microECoGs, spaced 1 millimeter (about one-25th of an inch) apart, was placed over one of two speech areas of the brain: first, the facial motor cortex, which controls movements of the mouth, lips, tongue and face – basically the muscles involved in speaking; and second, Wernicke's area, a little-understood part of the human brain tied to language comprehension.

Translating nerve signals into words

Once in place, the experimental microelectrodes were used to detect weak electrical signals from the brain generated by a few thousand neurons or nerve cells. During one-hour sessions conducted over four days, the scientists recorded brain signals as the patient repeatedly read each of 10 words that might be useful to a paralyzed person: yes, no, hot, cold, hungry, thirsty, hello, goodbye, more and less. Each of the 10 words was repeated from 31 to 96 times, depending on how tired the patient was.

Later, the researchers tried to figure out which brain signals represented each of the 10 words. When they compared any two brain signals – such as those generated when the man said the words "yes" and "no" – they were able to distinguish the brain signals for each word 76 percent to 90 percent of the time.
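Conceptually, this decoding step is a pairwise classification problem: for each pair of words, learn to tell the two sets of recorded trials apart. The study's actual features and classifier aren't described here, so the following is only a minimal sketch, assuming per-trial signal features have already been extracted into an array and using a generic linear classifier from scikit-learn; the `features` and `labels` arrays are hypothetical stand-ins for the recorded data.

```python
# Illustrative sketch only – not the study's actual method.
# Assumes per-trial features are already extracted into a
# (n_trials, n_features) array; the linear classifier is an assumption.
from itertools import combinations

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

WORDS = ["yes", "no", "hot", "cold", "hungry",
         "thirsty", "hello", "goodbye", "more", "less"]

def pairwise_accuracies(features, labels, n_folds=5):
    """Cross-validated accuracy for telling each pair of words apart.

    features : (n_trials, n_features) array of per-trial signal features
    labels   : (n_trials,) array of word indices into WORDS
    """
    results = {}
    for a, b in combinations(range(len(WORDS)), 2):
        mask = np.isin(labels, [a, b])  # keep trials of just these two words
        clf = LogisticRegression(max_iter=1000)
        scores = cross_val_score(clf, features[mask], labels[mask], cv=n_folds)
        results[(WORDS[a], WORDS[b])] = scores.mean()
    return results

# Hypothetical usage with synthetic stand-in data:
rng = np.random.default_rng(0)
features = rng.normal(size=(400, 32))           # e.g. 32 electrodes, one feature each
labels = rng.integers(0, len(WORDS), size=400)  # which word each trial belongs to
print(pairwise_accuracies(features, labels)[("yes", "no")])
```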

When they examined all 10 brain signal patterns at once, they were able to pick out the correct word any one signal represented only 28 percent to 48 percent of the time - better than chance (which would have been 10 percent) but not good enough for a device to translate a paralyzed person's thoughts into words spoken by a computer.

One unexpected finding: when the patient repeated words, the facial motor cortex was most active and Wernicke's area was less active. Yet Wernicke's area "lit up" when the patient was thanked by researchers after repeating words. This shows that Wernicke's area is more involved in the high-level understanding of language, while the facial motor cortex controls the facial muscles that help produce sounds, says Bradley Greger, an assistant professor of bioengineering at the University of Utah.

The researchers were most accurate – 85 percent – in distinguishing brain signals for one word from those for another when they used signals recorded from the facial motor cortex. They were less accurate – 76 percent – when using signals from Wernicke's area. Combining data from both areas didn't improve accuracy, showing that brain signals from Wernicke's area don't add much to those from the facial motor cortex.

When the scientists selected the five microelectrodes on each 16-electrode grid that were most accurate in decoding brain signals from the facial motor cortex, their accuracy in distinguishing one of two words from the other rose to almost 90 percent.

In the more difficult test of distinguishing brain signals for one word from signals for the other nine words, the researchers initially were accurate 28 percent of the time - not good, but better than the 10 percent random chance of accuracy. However, when they focused on signals from the five most accurate electrodes, they identified the correct word almost half (48 percent) of the time.
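Picking the handful of electrodes whose individual signals decode best, and then combining only those, is a standard channel-selection step. Under the same assumptions as the earlier sketch (hypothetical `features` and `labels` arrays and a generic linear classifier, not the study's actual procedure), it might look something like this:

```python
# Illustrative sketch only – rank electrodes by how well each one decodes
# the 10 words on its own, then re-score using just the best few.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def top_electrodes(features, labels, n_keep=5, n_folds=5):
    """Indices of the n_keep electrodes with the best single-channel accuracy."""
    per_channel = []
    for ch in range(features.shape[1]):
        clf = LogisticRegression(max_iter=1000)
        acc = cross_val_score(clf, features[:, [ch]], labels, cv=n_folds).mean()
        per_channel.append(acc)
    return np.argsort(per_channel)[-n_keep:]

def accuracy_with_channels(features, labels, channels, n_folds=5):
    """Cross-validated 10-way accuracy using only the chosen electrodes."""
    clf = LogisticRegression(max_iter=1000)
    return cross_val_score(clf, features[:, channels], labels, cv=n_folds).mean()

# Hypothetical usage (synthetic stand-in data, one feature per electrode):
rng = np.random.default_rng(0)
features = rng.normal(size=(400, 16))    # 16 electrodes in one microECoG grid
labels = rng.integers(0, 10, size=400)   # indices of the 10 words
best = top_electrodes(features, labels)
print(best, accuracy_with_channels(features, labels, best))
```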

"This is proof of concept," Bradley Greger, an assistant professor of bioengineering, "We've proven these signals can tell you what the person is saying well above chance. But we need to be able to do more words with more accuracy before it is something a patient really might find useful."

Next step

Now that they know the technology works, the team is looking to refine it. "The obvious next step – and this is what we are doing right now – is to do it with bigger microelectrode grids" with 121 microelectrodes in an 11-by-11 grid, he says. "We can make the grid bigger, have more electrodes and get a tremendous amount of data out of the brain, which probably means more words and better accuracy."

People who could eventually benefit from a wireless device that converts thoughts into computer-spoken words include those paralyzed by stroke, Lou Gehrig's disease and trauma, Greger says. People who are now "locked in" often communicate with any movement they can make – blinking an eye or moving a hand slightly – to arduously pick letters or words from a list.

The University of Utah team's study showing the feasibility of translating brain signals into computer-spoken words will be published in the September issue of the Journal of Neural Engineering.
