
Brain Activity During Speech Comprehension in Autism

Example of a stimulus used to study audiovisual integration. This figure depicts still frames from the movie clip, with the motion of the hand shown in blue. At the bottom are the target (right) and the foil (left) for a looped line with one loop in it.

Recent research suggests that people with autism have difficulty integrating information from multiple sensory inputs. Our previous studies showed that children with autism have difficulty using visual information, such as lip movements and gestures, to help with language comprehension. The current study builds on these findings by examining brain activity associated with auditory and visual processing in autism. We are exploring how the brain processes what we see and hear and how we integrate speech and gestures to understand others. This will help us better understand how nonverbal information influences language comprehension in children with autism.

Cortical regions with significant BOLD signal change in the typical adult participant group while viewing meaningful gestures as opposed to still images (red), and while viewing meaningless gestures as opposed to still images (blue). Areas of shared activation across the meaningful and meaningless conditions are shown in purple.

We will also study a part of the brain called the mirror neuron system, which is considered important for understanding the meaning of others’ actions. In this study, we will examine changes in brain activity during language comprehension tasks using a noninvasive tool called magnetic resonance imaging (MRI). This shows us which brain areas are active during speech and gesture comprehension, and will help us determine whether some regions work differently in children with autism.

This study is being funded by Autism Speaks.
