Multisensory space representation in macaque ventral intraparietal area
A. Schlack, S.J. Sterbing-D'Angelo, K. Hartung, K.-P. Hoffmann & F. Bremmer
Journal of Neuroscience 25:4616-4625, 2005
- Animals can use different sensory signals to localize objects in the environment. Depending on the situation,
the brain either integrates information from multiple sensory sources or chooses the modality conveying the
most reliable information to direct behavior (a standard reliability-weighted formulation is sketched after the
abstract). This suggests that the brain somehow has access to a modality-invariant representation of external
space. Accordingly, neural structures encoding signals from more than one sensory modality
are best suited for spatial information processing. In primates, the posterior parietal cortex (PPC) is a key structure
for spatial representations. One substructure within human and macaque PPC is the ventral intraparietal area (VIP),
known to represent visual, vestibular, and tactile signals. In the present study, we show for the first time that
macaque area VIP neurons also respond to auditory stimulation. Interestingly, the strength of the responses to
the acoustic stimuli depended strongly on their spatial location [i.e., most of the auditory-responsive
neurons had surprisingly small, spatially restricted auditory receptive fields (RFs)]. Given this finding, we compared
the auditory RF locations with the respective visual RF locations of individual area VIP neurons. In the vast majority
of neurons, the auditory and visual RFs largely overlapped (a toy overlap computation is sketched after the
abstract). Additionally, neurons with well-aligned visual and auditory RFs tended to encode multisensory space
in a common reference frame. This suggests that area
VIP constitutes a part of a neuronal circuit involved in the computation of a modality-invariant representation
of external space.
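
As referenced in the abstract, the integrate-or-select idea can be made concrete with the standard reliability-weighted (inverse-variance) combination of a visual and an auditory location estimate. This is a general textbook formulation offered for orientation, not an analysis taken from the paper:

  \hat{s} = w_v \hat{s}_v + w_a \hat{s}_a, \qquad
  w_v = \frac{1/\sigma_v^2}{1/\sigma_v^2 + 1/\sigma_a^2}, \qquad
  w_a = \frac{1/\sigma_a^2}{1/\sigma_v^2 + 1/\sigma_a^2}, \qquad
  \sigma^2 = \frac{\sigma_v^2 \sigma_a^2}{\sigma_v^2 + \sigma_a^2}

where \hat{s}_v and \hat{s}_a are the single-modality location estimates and \sigma_v^2, \sigma_a^2 their variances. When one modality is much more reliable, its weight approaches 1 and the combined estimate effectively "chooses" that modality; when reliabilities are comparable, the estimates are merged and the combined variance \sigma^2 is smaller than either alone.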
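
The visual-auditory RF comparison mentioned in the abstract can be pictured with a minimal sketch of an intersection-over-union overlap index computed on two firing-rate maps defined over the same spatial grid. The function, the thresholding rule, and the toy Gaussian fields below are illustrative assumptions, not the analysis used in the study.

# Hypothetical sketch: overlap of a visual and an auditory receptive field,
# each given as a firing-rate map on the same azimuth/elevation grid.
# Illustrative only; this does not reproduce the paper's analysis.
import numpy as np

def rf_overlap_index(rf_a, rf_b, threshold=0.5):
    """Intersection-over-union of the two RF regions.

    Each map is thresholded at `threshold` times its own peak response to
    define the RF region; 0 means disjoint fields, 1 means identical regions.
    """
    region_a = rf_a >= threshold * rf_a.max()
    region_b = rf_b >= threshold * rf_b.max()
    union = np.logical_or(region_a, region_b).sum()
    if union == 0:
        return 0.0
    return np.logical_and(region_a, region_b).sum() / union

# Toy example: two Gaussian fields on a +/-40 deg grid, centers 10 deg apart.
az, el = np.meshgrid(np.linspace(-40, 40, 81), np.linspace(-40, 40, 81))
visual_rf = np.exp(-(az**2 + el**2) / (2 * 15.0**2))
auditory_rf = np.exp(-((az - 10)**2 + el**2) / (2 * 20.0**2))
print(f"overlap index: {rf_overlap_index(visual_rf, auditory_rf):.2f}")

Well-aligned visual and auditory fields would give an index near 1; an index near 0 would indicate spatially disjoint fields.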