
Neural Control of Coordinated Movements

Figure 1. A time-lapse photograph illustrating a variety of trajectories produced using our 2-axis motorized gimbals and projected onto the inside of the Dome.

Orienting to objects in our environment is a ubiquitous behavior: it enables us to extract high-quality visual information about our world, helps us navigate through complex environments, and contributes to our daily interactions. Understanding the neural mechanisms underlying this common activity requires two things: methods for dissociating individual movement components, so that neural activity can be correlated with some features but not others, and recognition that naturally occurring movements are made within a broader context.

My lab has developed techniques that permit dissociation of eye and head movements during visual orienting to static targets. We are now adding the capacity to present moving targets with variable trajectories (Fig. 1), projected onto the inside surface of a 2.5 m dome (Fig. 2). These systems enhance our ongoing investigations of the neural control of head-unrestrained pursuit movements, of tasks combining pursuit and gaze shifts, and of orienting a visual target using a manually controlled joystick.
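The page does not describe the software driving the gimbals, but variable trajectories like those in Fig. 1 could, in principle, be generated parametrically and sampled at the motor controller's rate. The sketch below is a hypothetical illustration, not the lab's actual code; the function name, frequencies, and amplitudes are all assumptions chosen to produce Lissajous-like curves on two axes.

```python
import math

def gimbal_trajectory(t, fx=0.4, fy=0.7, ax=15.0, ay=15.0, phase=math.pi / 2):
    """Return hypothetical (azimuth, elevation) gimbal angles in degrees at time t (s).

    fx, fy -- oscillation frequencies (Hz) for the horizontal and vertical axes
    ax, ay -- amplitudes (deg); phase offsets the vertical axis.
    Unequal frequencies trace Lissajous-like curves rather than a straight line.
    """
    az = ax * math.sin(2 * math.pi * fx * t)
    el = ay * math.sin(2 * math.pi * fy * t + phase)
    return az, el

# Sample the path at 1 kHz for 5 s, as a motor controller might consume it.
path = [gimbal_trajectory(i / 1000.0) for i in range(5000)]
```

Varying `fx`, `fy`, and `phase` between trials would yield the family of distinct target paths suggested by the time-lapse photograph.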

Figure 2. A schematic drawing of the 2.5 m diameter dome placed inside the 2 m cube housing 3 pairs of Helmholtz field coils for eye and head movement measurements.
