Visual Speech App Could Help With Communication

Keene State College professor Dr. Lawrence Welkowitz and his app to aid people with autism. (Keene Sentinel - Michael Moore)

People with autism often have a hard time communicating: many struggle both to express emotions and to interpret them, which can make holding a conversation very difficult.

Lawrence Welkowitz, a psychology professor at Keene State College, conducts research aimed at helping people on the autism spectrum understand and imitate the subtle patterns of speech. He is now working on an iPad app that displays a person’s speech as sound waves. This visual representation helps users see, and try to match, how emotions are conveyed in their speech.

Professor Welkowitz is working with neurologists at Dartmouth’s Geisel School of Medicine to conduct studies testing the new app. The study has high-functioning adults on the spectrum use the app while researchers measure their brain activity with MRI scans. The tests are meant to show whether someone on the autism spectrum can learn new speech patterns through this method. Researchers say the results so far are promising, but there is still a lot of work to be done.

Welkowitz’s former research assistant at Keene State, Josh Green, says “visuals are good for autistic people who tend to be more literal, concrete, visual.” Now a grad student at the University of Connecticut, Green is doing his own research on speech development and autism. He and Welkowitz originally designed the app, named SpeechMatch, with the help of Museami, a digital company.

A person on the spectrum can use the app to listen to a variety of phrases spoken with different emotions or tones and see their sound waves. They then try to match their own voice and tone to each phrase and can see whether the sound waves line up. The app calculates a numerical score telling the user how close they are to matching the sound waves of the original phrase.
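The article does not describe how SpeechMatch computes its score, but the idea of comparing a user’s sound wave to a reference and reducing the difference to a single number can be illustrated with a minimal sketch. The function below is purely hypothetical: it resamples the user’s attempt to the reference length, then maps the correlation between the two envelopes to a 0–100 score (higher means a closer match).

```python
import numpy as np

def match_score(reference: np.ndarray, attempt: np.ndarray) -> float:
    """Hypothetical scorer: how closely an attempted phrase's envelope
    matches a reference recording, on a 0-100 scale."""
    # Resample the attempt to the reference length so the two
    # envelopes can be compared point by point.
    resampled = np.interp(np.linspace(0, 1, len(reference)),
                          np.linspace(0, 1, len(attempt)), attempt)
    # Normalize both signals to zero mean and unit variance.
    a = (reference - reference.mean()) / (reference.std() + 1e-9)
    b = (resampled - resampled.mean()) / (resampled.std() + 1e-9)
    # Pearson correlation, mapped from [-1, 1] onto [0, 100].
    r = float(np.mean(a * b))
    return round(50 * (r + 1), 1)

# An identical envelope scores 100; an inverted one scores 0.
env = np.sin(np.linspace(0, np.pi, 200))
print(match_score(env, env))    # → 100.0
print(match_score(env, -env))   # → 0.0
```

A real speech-matching app would more likely compare pitch contours and timing extracted from the audio, but the principle of collapsing the comparison into one feedback number is the same.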

Dr. Robert Roth, professor of psychology and part of the Dartmouth team, says “When they’re processing language [there’s] quite a lot of activation in the left brain areas,” which control more concrete thinking. What the researchers are really looking for is activation in the amygdala, the emotional control center of the brain. This app could prove very beneficial to people with ASD in the future; however, a lot more research and funding is needed before an official app is released.

The third day of ICare4Autism’s International Autism Conference, on July 2nd, will focus entirely on new technological developments and the ways technology is opening doors for people on the spectrum. To find out more and register for the conference, CLICK HERE.