Principal Investigator

Lizabeth M. Romanski, Ph.D.
University of Rochester
Box 603, 601 Elmwood Ave, Rochester, NY 14642
Office: MC 6-8539
Phone: (585) 273-1469
Fax: (585) 276-5334

Contact

Romanski Lab, University of Rochester
Phone: (585) 273-1469
Fax: (585) 276-5334

Affiliations

Prefrontal Cortex Research

Romanski Lab

The integration of auditory and visual stimuli is crucial for recognizing objects by sight and sound, communicating effectively, and navigating through our complex world. While auditory and visual information is combined at many sites in the human brain, the frontal lobes have been identified as a region associated with memory and language, functions that depend on the multisensory integration of complex auditory and visual stimuli. In our laboratory we are interested in how the ventral prefrontal cortex represents high-level auditory information and in the neuronal mechanisms that underlie the integration of complex auditory and visual information, primarily face and vocal information during communication. Studies in our laboratory have shown that neurons within specific regions of the ventral prefrontal cortex respond robustly to complex sounds, including species-specific vocalizations, while previous studies have shown that adjacent ventral prefrontal regions are selectively responsive to faces.

The functional organization of the primate prefrontal cortex. Regions of the frontal lobe are color coded according to function and connectivity. Areas involved in visuospatial working memory in the frontal lobe are shown in blue and receive a robust projection from the posterior parietal cortex, also colored blue. In contrast, the inferotemporal cortex (orange) sends information about visual objects, faces, and patterns to areas 45 and 12/47 in the ventrolateral prefrontal cortex (VLPFC), also colored orange. Neurons in the VLPFC have been shown to be responsive to object properties and to faces. Studies in our laboratory have documented projections from the anterior auditory cortex (yellow) in the superior temporal gyrus to anterior and ventral prefrontal cortex (yellow), area 12/47. Prefrontal auditory neurons in this region have been shown to respond to species-specific vocalizations. Finally, multisensory neurons have been found in VLPFC areas 12/47 and 45 that are responsive to combinations of faces and their corresponding vocalizations.

Recently we have demonstrated that neurons within the ventral prefrontal cortex are multisensory and respond both to faces and to the corresponding vocalizations. We have begun to explore some of the factors that affect the integration of dynamic faces and vocalizations in the frontal lobe, including temporal coincidence and stimulus congruence, as well as the emotional expression conveyed by the face-vocalization pair and the identity of the speaker. To this end we have examined changes in neural activity when face-vocalization pairs are mismatched either in content or in temporal sequence.

Coronal section through the prefrontal cortex.

Our data indicate that incongruent and temporally offset stimuli evoke changes in single prefrontal neurons under both active and passive task conditions. Single-unit recordings demonstrate that prefrontal neurons can encode these mismatches with changes in response latency and/or response magnitude. Additional studies are aimed at determining the connections of face/vocalization processing areas in the frontal lobe with unimodal and polymodal processing regions of the temporal lobe. Further analysis of the neural mechanisms that support face and voice integration in non-human primates may help us understand the mechanisms underlying social communication and social cognition.