Asynchrony adaptation reveals neural population code for audio-visual timing
Publication date
2011

Peer-Reviewed
Yes

Open Access status
closedAccess
Abstract
The relative timing of auditory and visual stimuli is a critical cue for determining whether sensory signals relate to a common source and for making inferences about causality. However, the way in which the brain represents temporal relationships remains poorly understood. Recent studies indicate that our perception of multisensory timing is flexible: adaptation to a regular inter-modal delay alters the point at which subsequent stimuli are judged to be simultaneous. Here, we measure the effect of audio-visual asynchrony adaptation on the perception of a wide range of sub-second temporal relationships. We find distinctive patterns of induced biases that are inconsistent with previous explanations based on changes in perceptual latency. Instead, our results are well accounted for by a neural population coding model in which: (i) relative audio-visual timing is represented by the distributed activity across a relatively small number of neurons tuned to different delays; (ii) the algorithm for reading out this population code is efficient, but subject to biases owing to under-sampling; and (iii) the effect of adaptation is to modify neuronal response gain. These results suggest that multisensory timing information is represented by a dedicated population code and that shifts in perceived simultaneity following asynchrony adaptation arise from neural processes analogous to those underlying well-known perceptual after-effects.
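As an illustration of the coding scheme the abstract describes, the Python sketch below implements a toy version: a small bank of delay-tuned channels, a centroid readout of the population response, and adaptation modelled as gain suppression of channels tuned near the adapted delay. The channel count, Gaussian tuning curves, centroid decoder, and suppression profile are hypothetical choices for illustration, not the parameters or fitting procedure of the published model.

import numpy as np

# Hypothetical bank of neurons tuned to different audio-visual delays (ms);
# a "relatively small number" of channels, as the abstract describes.
preferred_delays = np.linspace(-300.0, 300.0, 7)
tuning_width = 120.0  # assumed Gaussian tuning width (ms)

def responses(delay, gains):
    """Population response to a stimulus with the given audio-visual delay."""
    return gains * np.exp(-0.5 * ((delay - preferred_delays) / tuning_width) ** 2)

def decode(delay, gains):
    """Read out perceived delay as the gain-weighted centroid of the population."""
    r = responses(delay, gains)
    return np.sum(r * preferred_delays) / np.sum(r)

baseline = np.ones_like(preferred_delays)

# Adaptation to a +100 ms delay: suppress the gain of channels tuned near it.
adapt_delay = 100.0
adapted = baseline * (1.0 - 0.5 * np.exp(
    -0.5 * ((adapt_delay - preferred_delays) / tuning_width) ** 2))

for test in (-100.0, 0.0, 100.0):
    print(f"test {test:+6.1f} ms: pre {decode(test, baseline):+7.2f}, "
          f"post {decode(test, adapted):+7.2f}")

In this toy version, suppressing gain around the adapted delay shifts the decoded estimate of subsequent stimuli away from the adapter, producing the kind of repulsive bias in perceived simultaneity associated with classic perceptual after-effects; the limited number of channels also introduces small edge biases, loosely analogous to the under-sampling effects mentioned above.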
Version
No full-text in the repository

Citation
Roach NW, Heron J, Whitaker DJ et al. (2011) Asynchrony adaptation reveals neural population code for audio-visual timing. Proceedings of the Royal Society B: Biological Sciences, 278(1710): 1314-1322.

Link to Version of Record
https://doi.org/10.1098/rspb.2010.1737

Type
Article