Brainwave-to-speech system under study

Grant funds research into aid for communication

Edlyn Peña and her team will analyze Think to Speak, which records brainwave patterns and uses them to trigger pre-programmed words and phrases that are spoken aloud on an iPad, iPhone or iPod Touch.

(THOUSAND OAKS, Calif. – Feb. 16, 2017) A California Lutheran University faculty member has received a $67,000 grant to assess whether a new brain-computer interface can effectively help nonverbal people speak.

Edlyn Peña, co-director of Cal Lutheran’s Autism and Communication Center, received the one-year grant from the Disability Communications Fund. The research ties into a fund priority to encourage the development of technologies that serve the communication needs of Californians with disabilities. 

Peña, center co-director Beth Brennan, Agoura Hills speech-language pathologist Ali Steers and Kirsten Brown of the University of Wisconsin-Madison will analyze Think to Speak, which was developed by Santa Barbara-based Smartstones Inc. Pairing an EEG brainwave-sensing headset with a mobile communication app called :prose, the system records brainwave patterns and uses them to trigger pre-programmed words and phrases that are spoken aloud on an iPad, iPhone or iPod Touch.

Brain-computer interface systems such as Think to Speak could be particularly valuable for nonverbal people who have severely limited motor control, potentially offering a hands-free way to communicate. Most current technologies that help people speak require users to perform physical tasks, such as touching icons, but the inability to produce speech is often due to motor-control limitations.

This type of brain-interface technology could also help all nonverbal people by providing a quicker, more fluid way to communicate with an expanded vocabulary that isn’t limited by icon-based software.

While promising, the technology is in its infancy and not widely available.

More than 2 million United States residents have complex communication needs. Around 35 percent of children with autism spectrum disorders speak minimally or not at all. Brain-computer interfaces could also help stroke survivors and people with Lou Gehrig’s disease, cerebral palsy, traumatic brain injuries and aphasia.

Peña, an associate professor and director of doctoral studies in Cal Lutheran’s Graduate School of Education, is recruiting people to participate in the study. Five participants will be non-disabled people whose speech meets daily communication needs, five will be clients of a nonprofit assisted-living facility in Santa Barbara who have developmental disabilities and complex motor-control and communication challenges, and five will be individuals with autism and complex motor-control and communication challenges who use traditional augmentative and alternative communication tools.

For more information on the study or center, email or visit.