Prefrontal Cortex Research
The integration of auditory and visual stimuli is crucial for recognizing objects by sight and sound, communicating effectively, and navigating through our complex world. While auditory and visual information are combined at many sites in the human brain, the frontal lobes have been identified as a region associated with memory and language, functions that depend on multisensory integration of complex auditory and visual stimuli. In our laboratory we are interested in how the ventral prefrontal cortex represents high-level auditory information and in the neuronal mechanisms that underlie the integration of complex auditory and visual information, primarily face and vocal information during communication. Studies in our laboratory have shown that neurons within specific regions of the ventral prefrontal cortex are robustly responsive to complex sounds, including species-specific vocalizations, while previous studies have shown that adjacent ventral prefrontal regions are selectively responsive to faces.
The functional organization of the primate prefrontal cortex. Regions of the frontal lobe are color coded according to function and connectivity. Areas involved in visuospatial working memory in the frontal lobe are blue and receive a robust projection from the posterior parietal cortex, also colored blue. In contrast, the inferotemporal cortex (orange) sends information about visual objects, faces, and patterns to areas 45 and 12/47 in the ventrolateral prefrontal cortex (VLPFC), also colored orange. Neurons in the VLPFC have been shown to be responsive to object properties and to faces. Studies in our laboratory have documented projections from the anterior auditory cortex (yellow) in the superior temporal gyrus to anterior and ventral prefrontal cortex (yellow), area 12/47. Prefrontal auditory neurons in this region have been shown to respond to species-specific vocalizations. Finally, multisensory neurons have been found in the VLPFC, in areas 12/47 and 45, that are responsive to combinations of faces and their corresponding vocalizations.
Recently we have demonstrated that neurons within the ventral prefrontal cortex are multisensory and respond both to faces and to the corresponding vocalizations. We have begun to explore some of the factors that affect the integration of dynamic faces and vocalizations in the frontal lobe, including temporal coincidence, stimulus congruence, the emotional expression conveyed by the face-vocalization pair, and the identity of the speaker. To this end we have examined changes in neural activity when face-vocalization pairs are mismatched either in content or in temporal sequence.
Our data indicate that incongruent and temporally offset stimuli evoke changes in single prefrontal neurons under both active and passive task conditions. Single-unit recordings demonstrate that prefrontal neurons can encode these mismatches with changes in response latency and/or changes in response magnitude. Additional studies are aimed at determining the connections of face/vocalization processing areas in the frontal lobe with unimodal and polymodal processing regions of the temporal lobe. Further analysis of the neural mechanisms that support face and voice integration in non-human primates may help us to understand the mechanisms underlying social communication and social cognition.
- Coding of vocalizations by single neurons in ventrolateral prefrontal cortex. Hear Res. 305, 135-43. (2013 Nov 01).
- Faces in motion: selectivity of macaque and human face processing areas for dynamic stimuli. J Neurosci. 33, 11768-73. (2013 Jul 17).
- Timing of audiovisual inputs to the prefrontal cortex and multisensory integration. Neuroscience. 214, 36-48. (2012 Jul 12).