December 15, 2023
Live and On-Demand
Talk abstract:
The ability to communicate with our surroundings and to interact with them through grasping is invaluable to human independence. Conditions such as amyotrophic lateral sclerosis (ALS), cerebral lesions, and spinal cord injuries can lead to paralysis, affecting both movement and speech. Brain-machine interfaces (BMIs) offer a promising technological pathway to help affected patients regain independence by reading signals directly from the brain and using those signals to control external devices or to communicate. A promising target site for multimodal BMI applications is the posterior parietal cortex (PPC). We previously showed that the supramarginal gyrus (SMG), located in the PPC, is highly modulated by both grasping and vocalized speech. However, an ideal speech BMI should represent internal speech (also called imagined or covert speech), the act of talking inside one's head without associated movement. Results in internal speech decoding are sparse and have yet to achieve high functionality in real time, yet a BMI leveraging internal speech would be invaluable for the locked-in population. In this work, a participant with C5–C6 tetraplegia implanted with Utah arrays in the SMG performed an internal and vocalized speech task. A real-time decoder trained only on data recorded during internal speech reached up to 91% accuracy on a vocabulary of eight words. Evidence for both phonetic and semantic language representation was found by decoding words with identical semantic meanings and homonyms. We show robust internal speech modulation within an area of the SMG that is also involved in grasping, providing proof of concept that multimodal BMIs can be built using multielectrode arrays implanted in a single brain area.
About the speaker:
Dr. Sarah Wandelt is a neuroscientist whose research focuses on using brain-machine interfaces to restore hand movement and communication in people affected by tetraplegia. Under the mentorship of Professor Richard Andersen at the California Institute of Technology, she established that certain regions within the posterior parietal cortex strongly represent grasp positions as well as language. These findings led to the development of the first real-time internal speech brain-machine interface.