Audiovisual Mnemonic Processing

[Figure: AV task cartoon, Romanski Lab]

The audiovisual non-match-to-sample task is depicted in the cartoon. The subject presses a button when either the face or the vocalization does not match the sample stimulus. In our variant of the task, some trials had a visual mismatch, some an auditory mismatch, and in some both the face and the voice differed from the sample.
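
A minimal sketch of the trial logic is given below, using hypothetical stimulus labels and condition names; it illustrates the response rule described above and is not the task code used in the lab.

```python
from dataclasses import dataclass
import random

# Hypothetical face-vocalization movie, identified here only by the identity
# shown in the visual track and the identity heard in the auditory track.
@dataclass(frozen=True)
class FaceVoiceStimulus:
    face_id: str   # identity in the visual track
    voice_id: str  # identity in the auditory track

def is_nonmatch(sample: FaceVoiceStimulus, test: FaceVoiceStimulus) -> bool:
    """The subject should press the button when either the face or the
    vocalization of the test stimulus differs from the sample."""
    return test.face_id != sample.face_id or test.voice_id != sample.voice_id

def make_nonmatch(sample: FaceVoiceStimulus, condition: str,
                  other_id: str) -> FaceVoiceStimulus:
    """Build a non-match stimulus by altering the visual track, the auditory
    track, or both (the three trial types described above)."""
    if condition == "visual_mismatch":
        return FaceVoiceStimulus(face_id=other_id, voice_id=sample.voice_id)
    if condition == "auditory_mismatch":
        return FaceVoiceStimulus(face_id=sample.face_id, voice_id=other_id)
    if condition == "both_mismatch":
        # Both tracks come from a different identity, so the resulting pair
        # is congruent with itself but differs from the sample.
        return FaceVoiceStimulus(face_id=other_id, voice_id=other_id)
    raise ValueError(f"unknown condition: {condition}")

# Example trial: sample is identity "A"; the non-match condition is drawn at random.
sample = FaceVoiceStimulus(face_id="A", voice_id="A")
condition = random.choice(["visual_mismatch", "auditory_mismatch", "both_mismatch"])
test = make_nonmatch(sample, condition, other_id="B")
print(condition, "-> press button:", is_nonmatch(sample, test))
```

Note that altering a single track yields an incongruent face-voice pair, while altering both tracks yields a congruent pair that differs from the sample; these correspond to the incongruent and congruent non-match stimuli described below.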

Our previous work indicates that neurons in the primate ventrolateral prefrontal cortex (VLPFC) are multisensory and respond to combined audiovisual communication stimuli. To further investigate the role of the VLPFC in sensory integration, we recorded prefrontal neuronal activity while animals performed a discrimination task with face-vocalization stimuli as the memoranda. We recorded single-cell activity from previously identified multisensory regions of the VLPFC while monkeys performed a non-match-to-sample task.

In this task, a face-vocalization movie was presented as the sample, and a variation of that face-voice stimulus was presented as the non-match stimulus. We created the non-match stimulus by altering either the auditory or the visual track of the vocalization movie. The non-match stimuli included either incongruent face-voice pairs, in which the face and voice were mismatched in content or identity, or congruent face-voice pairs, which matched one another but differed from the sample pair. Performance of this paradigm produced both task-related and stimulus-related changes in neuronal firing compared to baseline. Cells were found that were active in various phases of the task, and some cells were affected by the stimulus shown in the sample period. Interestingly, a number of cells showed changes in firing that depended on the nature of the non-match stimulus.

For example, some cells were sensitive to changes in the auditory (vocalization) component, while others were more affected by alterations of the visual stimulus. In addition, whether a non-match stimulus was congruent or incongruent had a significant effect on a proportion of cells, with congruent stimuli evoking either a significant increase or a significant decrease in firing relative to incongruent stimuli during the non-match period. Continued analysis and recordings are aimed at determining the role of the VLPFC in the integration of audiovisual communication information and the importance of stimulus congruency to multisensory integration in the primate frontal lobe.
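
For illustration only, the sketch below shows one way such a congruency comparison could be framed for a single cell, using simulated spike counts and a rank-sum test; the variable names, firing rates, and test choice are assumptions and do not reflect the lab's actual analysis.

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)

# Hypothetical per-trial spike counts for one cell during the non-match
# period, simulated here purely for illustration (not recorded data).
congruent_counts = rng.poisson(lam=12, size=40)    # congruent non-match trials
incongruent_counts = rng.poisson(lam=8, size=40)   # incongruent non-match trials

# One way to ask whether non-match-period firing differs between the two
# conditions for this cell: a two-sided rank-sum (Mann-Whitney U) test.
stat, p_value = mannwhitneyu(congruent_counts, incongruent_counts,
                             alternative="two-sided")
direction = "higher" if congruent_counts.mean() > incongruent_counts.mean() else "lower"
print(f"U = {stat:.1f}, p = {p_value:.4g}; congruent firing is {direction} on average")
```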
