Wednesday, December 11, 2019

Speech Decoded from Brain Activity in Area for Hand Control

reposted from

Speech Decoded from Brain Activity in Area for Hand Control

The surprising finding comes courtesy of two study participants with implanted electrode arrays that record activity at single-neuron resolution.

Dec 10, 2019
SHAWNA WILLIAMS
ABOVE: A computer rendering of electrode arrays implanted in the hand knob area of the brain
JAIMIE HENDERSON/STANFORD UNIVERSITY
In the 1930s, neurosurgeon Wilder Penfield and his colleagues proposed a model for understanding how the brain controls movement that they termed the motor homunculus. The central idea was that different parts of a brain area called the precentral gyrus are charged with moving the hands, legs, face, and so on. While the idea was recognized as a simplification from the beginning, and has been further complicated by subsequent studies, it remains true in neuroscience that different areas of the gyrus specialize in controlling specific parts of the body.
One of those areas is the “hand knob,” which, as the name suggests, is a knobby region of the gyrus involved in hand and arm movements. It also has another, surprising function. A team based at Stanford University reports today (December 10) in eLife that some neurons there are active during speech, and their signals can be decoded to reveal the word or sound uttered. The results lend new insight into brain organization, and could be useful in devising future brain-computer interfaces that would enable communication by people who cannot talk.
“I’m really excited about the paper,” says Edward Chang, a neurosurgeon at the University of California, San Francisco, who works on brain-computer interfaces but was not involved in the new study. “It raises questions about how exclusive the assignment of function to a given area is. . . . I think that [lack of exclusivity is] something that we haven’t fully appreciated until now.”

See “Computer Program Converts Brain Signals to a Synthetic Voice”

The research was conducted as part of the BrainGate2 study, a long-running trial in which participants with quadriplegia have been implanted with sensors in their motor cortices that can record the activity and locations of specific neurons. Researchers are working with the participants to hone the ability of a brain-computer interface to interpret neuronal activity into assisted movement, such as directing an on-screen cursor.
Because some of the participants have the sensors embedded in the hand knob, Sergey Stavisky, a neuroscientist at Stanford, figured this offered an opportunity to explore what, if anything, the area is involved in beyond arm movement. Specifically, he focused on speech. “We know from lesion studies, like if someone has a stroke, that if you damage this area, you can’t move your hand, but you can still talk,” which demonstrates that the region isn’t crucial for speech. But this and other available types of data, such as those from fMRI or electrocorticography studies, give a relatively coarse look at brain activity, Stavisky says.
So Stavisky, Stanford neurosurgeon Jaimie Henderson, Howard Hughes Medical Institute investigator Krishna Shenoy of Stanford, and their colleagues worked with two BrainGate2 participants who had sensors inside the hand knob. While recording activity from the implants, they asked the participants to say certain words, or parts of words known as phonemes, from prepared lists. Analyzing the recorded brain activity afterward, the researchers were able most of the time to correctly discern which word or phoneme was spoken. In one participant, they accurately decoded which of 10 words was spoken 85 percent of the time, while in the other, whose sensor Stavisky says didn’t work as well, the accuracy was about 55 percent.
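To give a flavor of what “decoding” means here, the sketch below uses a deliberately simple stand-in: each utterance is represented as a vector of per-neuron firing rates, and a nearest-centroid classifier assigns it to the word whose average training pattern is closest. This is purely illustrative; the word list, neuron counts, and data are all synthetic, and the study’s actual decoder was more sophisticated than this.

```python
import random
import math

random.seed(0)

WORDS = ["beet", "bat", "boot"]  # hypothetical word list, not the study's
N_NEURONS = 20                   # one firing rate per recorded neuron

# Synthetic "ground truth": each word gets a characteristic firing pattern.
true_pattern = {w: [random.uniform(5, 50) for _ in range(N_NEURONS)] for w in WORDS}

def simulate_trial(word, noise=3.0):
    """One simulated utterance: the word's pattern plus trial-to-trial noise."""
    return [r + random.gauss(0, noise) for r in true_pattern[word]]

# Training data: several repetitions of each word.
train = [(w, simulate_trial(w)) for w in WORDS for _ in range(10)]

# Build one centroid per word by averaging its training trials.
centroids = {}
for w in WORDS:
    trials = [rates for label, rates in train if label == w]
    centroids[w] = [sum(vals) / len(vals) for vals in zip(*trials)]

def decode(rates):
    """Label a new trial by the nearest centroid (Euclidean distance)."""
    return min(centroids, key=lambda w: math.dist(rates, centroids[w]))

# Evaluate on fresh simulated trials.
test = [(w, simulate_trial(w)) for w in WORDS for _ in range(20)]
accuracy = sum(decode(rates) == w for w, rates in test) / len(test)
print(f"decoding accuracy: {accuracy:.0%}")
```

On this clean synthetic data the toy decoder scores near 100 percent; real neural recordings are far noisier, which is why figures like the 55 percent reported for the second participant are still well above the 10 percent chance level for a 10-word set.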

See “Meet An Artist With No Hands”

Speech is typically thought of as controlled by the inner, or ventral, part of the brain, notes Jonathan Brumberg, a University of Kansas neuroscientist who studies speech and language in the brain and was not involved in the study. While some evidence has suggested that outer, or dorsal, areas of the brain are also involved in speech, “this is one of the first papers to really provide direct evidence for that,” he says. It’s not yet clear why speech would have corresponding activity in the hand and arm control area, he adds, and the results raise the question, “what is the role of this dorsal area [in speech] if it is doing something separate from the ventral area?”
One possibility the research team considered is that speech activity doesn’t normally occur in the hand knob, but rather had “remapped” there because of the participants’ paralysis. A team led by Shenoy and Henderson separately found that imagined movement all over the body, not just the arms and hands, is represented in the hand knob.
Ultimately, Stavisky says, he hopes to help develop a speech prosthetic for people who have lost the ability to talk. Finding speech-related activity in a new place could make working toward that goal easier, he says, because “it gives us more places from which to pull [the signals] out.”
S.D. Stavisky et al., “Neural ensemble dynamics in dorsal motor cortex during speech in people with paralysis,” eLife, doi:10.7554/eLife.46015, 2019
Shawna Williams is an associate editor at The Scientist. Email her at swilliams@the-scientist.com or follow her on Twitter @coloradan.
