August 16, 2025
Authors: Justin J. Jude, Stephanie Haro, Hadar Levi-Aharoni, Hiroaki Hashimoto, Alexander J. Acosta, Nicholas S. Card, Maitreyee Wairagkar, David M. Brandman, Sergey D. Stavisky, Ziv M. Williams, Sydney S. Cash, John D. Simeral, Leigh R. Hochberg, Daniel B. Rubin
Abstract: Intracortical brain-computer interfaces (iBCIs) for decoding intended speech have provided individuals with ALS and severe dysarthria an intuitive method for high-throughput communication. These advances have been demonstrated in individuals who are still able to vocalize and move their speech articulators. Here, we decoded intended speech from an individual with longstanding anarthria, locked-in syndrome, and ventilator dependence due to advanced symptoms of ALS. We found that phonemes, words, and higher-order language units could be decoded well above chance. While sentence decoding accuracy was below that of demonstrations in participants with dysarthria, we obtained an extensive characterization of the neural signals underlying speech in a person with locked-in syndrome, and our results identify several directions for future improvement. These include closed-loop speech imagery training and decoding linguistic (rather than phonemic) units from neural signals in the middle precentral gyrus. Overall, these results demonstrate that speech decoding from motor cortex may be feasible in people with anarthria and ventilator dependence. For individuals with longstanding anarthria, a purely phoneme-based decoding approach may lack the accuracy necessary to support independent use as a primary means of communication; however, additional linguistic information embedded within neural signals may provide a route to augment the performance of speech decoders.
Read the full pre-print here.