Principal Investigators

Gary D. Paige, M.D., Ph.D., University of Rochester, Box 603, 601 Elmwood Ave., Rochester, NY 14642; office: MC 5-7425A; phone: 585-275-6395; fax: 585-442-9480
William E. O'Neill, Ph.D., University of Rochester, Box 603, 601 Elmwood Ave., Rochester, NY 14642; office: MC 6-8531; phone: 585-275-4023
Scott H. Seidman, Ph.D., University of Rochester, Box 603, 601 Elmwood Ave., Rochester, NY 14642; office: MC 6-8543; phone: 585-273-2122; fax: 585-756-5334

Role of Gravito-Inertial Force Vector in the Generation of Perception and Eye Movements

The vestibular system, through the vestibulo-ocular reflex (VOR), maintains vision during angular (aVOR) and linear (lVOR) head movements by producing compensatory eye movements that stabilize images on the retina. The vestibular system also contributes to the perception of motion and orientation in space. Because the otolith organs serve as the body's transducers of linear acceleration, they respond identically to tilt with respect to gravity and to the linear accelerations experienced during translational motion; their signal is therefore ambiguous. Failure to accurately differentiate these two types of stimuli (i.e., to resolve the ambiguity) would lead to behavioral failures such as gaze perturbations, falls, and inappropriate percepts of motion with their associated reflex responses (for example, a compelling perception of translational acceleration while lying prone). I am investigating how the vestibular system differentiates these two forms of acceleration in both the perceptual and eye movement systems in humans.
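The tilt-translation ambiguity can be illustrated with a minimal numerical sketch, assuming a simplified one-dimensional inter-aural axis (the function name and sign convention here are illustrative, not part of the study):

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def interaural_shear(roll_deg=0.0, translation_accel=0.0):
    """Inter-aural component of the gravito-inertial force sensed by the
    otoliths for a head rolled by roll_deg and/or translated laterally
    with translation_accel (m/s^2). Tilt and translation simply add,
    so the otolith signal alone cannot distinguish them."""
    return G * np.sin(np.radians(roll_deg)) + translation_accel

# A 10-degree roll tilt and a purely lateral translation of ~1.70 m/s^2
# produce identical otolith shear.
shear_from_tilt = interaural_shear(roll_deg=10.0)
shear_from_translation = interaural_shear(translation_accel=G * np.sin(np.radians(10.0)))
print(np.isclose(shear_from_tilt, shear_from_translation))  # True
```

Any mechanism that resolves the ambiguity must therefore rely on information beyond this instantaneous shear signal, such as its frequency content.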

Eye movement and perception data strongly suggest that frequency content plays an important role in resolving the inherent otolith ambiguity, so I have developed a number of robust techniques for assessing perceptions of tilt and translational motion at frequencies from DC up to 4 Hz. Eye movement data suggest that the vestibular system does not fully resolve the three-dimensional gravito-inertial force (GIF) vector, but instead simplifies the problem by making lower-dimensional approximations, such as treating simple utricular projections of the GIF as inter-aural translational accelerations. To determine precisely which simplifications are made, in both the VOR and perceptual systems, I am assessing translational and tilt responses (eye movements and percepts) in human subjects across a variety of postural orientations, and quantifying them in all response planes.

For example, with a subject upright, an acceleration along the inter-aural axis can reflect either horizontal translation or, under the lower-dimensional approximation described above, tilt in roll. With the subject supine, however, this lower-dimensional approximation is no longer valid: the three-dimensional GIF vector associated with the same acceleration would actually correspond to tilt in yaw. Early results indicate that the VOR and perceptual mechanisms do not incorporate subject orientation, but continue to use low-dimensional approximations rather than correctly resolving the GIF, even though these approximations do not yield directionally correct responses for all head orientations; the vestibular system thus appears to operate under the assumption that the head is upright.
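The posture dependence described above can be sketched as a toy three-dimensional model, assuming head axes x = naso-occipital, y = inter-aural, z = rostro-caudal (these conventions and function names are illustrative only):

```python
import numpy as np

G = 9.81  # m/s^2

def rot_x(a):  # roll, about the naso-occipital axis
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):  # pitch, about the inter-aural axis
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):  # yaw, about the rostro-caudal axis
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

g_world = np.array([0.0, 0.0, -G])  # gravity in world coordinates

def interaural_gif(R_head_to_world):
    """Inter-aural (y) component of gravity expressed in head coordinates."""
    return (R_head_to_world.T @ g_world)[1]

tilt = np.radians(15)

# Upright subject, 15-degree roll tilt: inter-aural shear appears.
upright_roll = interaural_gif(rot_x(tilt))

# Supine subject (pitched back 90 degrees), 15-degree yaw tilt: the same
# inter-aural shear appears, even though the full 3-D GIF now signals yaw.
supine_yaw = interaural_gif(rot_y(-np.pi / 2) @ rot_z(tilt))

print(abs(upright_roll), abs(supine_yaw))  # equal in magnitude
```

A system that reads only the inter-aural projection would interpret both cases as roll tilt (or lateral translation), which is directionally wrong in the supine posture; this is the head-upright assumption the early results point to.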

Current efforts involve stimulation of the otolith organs in isolation, without a corresponding stimulus to the head's sensors of rotational motion, the semicircular canals. In future work, for a limited subset of postural orientations, the responses outlined above will be assessed during dynamic true tilt, rather than the isolated linear accelerations used to date. In addition, experiments to date indicate that the vestibular responses assessed operate under the assumption that the subject's head is upright, which has profound implications for postural stability.
