BRADFORD SCHOLARS


    Asynchrony adaptation reveals neural population code for audio-visual timing

    Publication date
    2011
    Author
    Roach, N.W.
    Heron, James
    Whitaker, David J.
    McGraw, Paul V.
    Keyword
    Adaptation; Algorithms; Auditory perception; Brain; Humans; Models; Neurons; Visual perception; REF 2014
    Peer-Reviewed
    Yes
    Abstract
    The relative timing of auditory and visual stimuli is a critical cue for determining whether sensory signals relate to a common source and for making inferences about causality. However, the way in which the brain represents temporal relationships remains poorly understood. Recent studies indicate that our perception of multisensory timing is flexible: adaptation to a regular inter-modal delay alters the point at which subsequent stimuli are judged to be simultaneous. Here, we measure the effect of audio-visual asynchrony adaptation on the perception of a wide range of sub-second temporal relationships. We find distinctive patterns of induced biases that are inconsistent with previous explanations based on changes in perceptual latency. Instead, our results can be well accounted for by a neural population coding model in which: (i) relative audio-visual timing is represented by the distributed activity across a relatively small number of neurons tuned to different delays; (ii) the algorithm for reading out this population code is efficient, but subject to biases owing to under-sampling; and (iii) the effect of adaptation is to modify neuronal response gain. These results suggest that multisensory timing information is represented by a dedicated population code and that shifts in perceived simultaneity following asynchrony adaptation arise from analogous neural processes to well-known perceptual after-effects.
    URI
    http://hdl.handle.net/10454/6154
    Version
    No full-text in the repository
    Citation
    Roach NW, Heron J, Whitaker DJ et al. (2011) Asynchrony adaptation reveals neural population code for audio-visual timing. Proceedings: Biological Sciences. 278(1710): 1314-22.
    Link to publisher’s version
    http://dx.doi.org/10.1098/rspb.2010.1737
    Type
    Article
    Collections
    Life Sciences Publications
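    The population-coding account summarised in the abstract (delay-tuned neurons, a read-out of the distributed response, and adaptation acting as a gain change) can be illustrated with a minimal numerical sketch. All parameter values, the Gaussian tuning curves, and the centroid read-out below are illustrative assumptions for exposition, not the model or values used in the paper:

    ```python
    import numpy as np

    # Illustrative sketch: a small bank of neurons tuned to different
    # audio-visual delays (ms). Tuning shape, widths, and counts are
    # assumptions, not taken from Roach et al. (2011).
    preferred = np.linspace(-400.0, 400.0, 9)  # preferred delays (ms)
    sigma = 120.0                              # assumed tuning width (ms)

    def responses(delay, gains):
        """Gaussian tuning curves scaled by a per-neuron gain factor."""
        return gains * np.exp(-0.5 * ((delay - preferred) / sigma) ** 2)

    def decode(delay, gains):
        """Centroid read-out of the population response (one simple
        read-out choice; the paper's read-out may differ)."""
        r = responses(delay, gains)
        return np.sum(preferred * r) / np.sum(r)

    baseline = np.ones_like(preferred)

    # Adaptation to a +200 ms delay, modelled as a gain reduction for
    # neurons tuned near the adapted delay.
    adapted = baseline * (1.0 - 0.5 * np.exp(-0.5 * ((200.0 - preferred) / sigma) ** 2))

    test_delay = 100.0
    before = decode(test_delay, baseline)
    after = decode(test_delay, adapted)
    # Gain loss near the adapter repels the decoded delay away from it,
    # analogous to a perceptual after-effect.
    print(f"decoded before adaptation: {before:.1f} ms, after: {after:.1f} ms")
    ```

    In this toy version, a test delay near the adapter decodes to a smaller value after adaptation than before, mirroring the repulsive shifts in perceived timing that the abstract attributes to gain changes in a delay-tuned population.
    
    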
