BRADFORD SCHOLARS

    Sensory memory is allocated exclusively to the current event-segment

    Export formats: CSV, RefMan, EndNote, BibTeX, RefWorks
    File: tripathy_et_al_2018.pdf (1.296 MB)
    Publication date: 2018-09-07
    Authors: Tripathy, Srimant P.; Ögmen, H.
    Keywords: Event segmentation; Iconic memory; Memory; Modal model of memory; Multiple-object tracking; Sensory memory; Short-term memory; Tracking
    Rights: (c) 2018 The Authors. This is an Open Access article distributed under the Creative Commons CC-BY license (http://creativecommons.org/licenses/by/4.0/)
    Peer-Reviewed: Yes
    
    Abstract
    The Atkinson-Shiffrin modal model forms the foundation of our understanding of human memory. It consists of three stores (Sensory Memory (SM), also called iconic memory, Short-Term Memory (STM), and Long-Term Memory (LTM)), each tuned to a different time-scale. Since its inception, the STM and LTM components of the modal model have undergone significant modifications, while SM has remained largely unchanged, representing a large-capacity system funneling information into STM. In the laboratory, visual memory is usually tested by presenting a brief static stimulus and, after a delay, asking observers to report some aspect of the stimulus. However, under ecological viewing conditions, our visual system receives a continuous stream of inputs, which is segmented into distinct spatio-temporal segments, called events. Events are further segmented into event-segments. Here we show that SM is not an unspecific general funnel to STM but is allocated exclusively to the current event-segment. We used a Multiple-Object Tracking (MOT) paradigm in which observers were presented with disks moving in different directions along bi-linear trajectories, i.e., linear trajectories with a single deviation in direction at the mid-point of each trajectory. The synchronized deviation of all of the trajectories produced an event stimulus consisting of two event-segments. Observers reported the pre-deviation or the post-deviation directions of the trajectories. By analyzing observers' responses in partial- and full-report conditions, we investigated the involvement of SM for the two event-segments. The hallmarks of SM hold only for the current event-segment. As the large-capacity SM stores only items involved in the current event-segment, the need for event-tagging in SM is eliminated, speeding up processing in active vision. By characterizing how memory systems are interfaced with ecological events, this new model extends the Atkinson-Shiffrin model by specifying how events are stored in the first stage of multi-store memory systems.
    URI: http://hdl.handle.net/10454/16722
    Version: Published version
    Citation: Tripathy SP and Ögmen H (2018) Sensory memory is allocated exclusively to the current event-segment. Frontiers in Psychology. 9: 1435.
    Link to publisher’s version: https://doi.org/10.3389/fpsyg.2018.01435
    Type: Article
    Collections: Life Sciences Publications
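
    The citation details above can also be retrieved programmatically from the publisher’s DOI using standard DOI content negotiation, the same mechanism that underlies export formats such as BibTeX. The following is a minimal sketch in Python, assuming only the third-party requests package; the Accept media type shown is the standard one honoured for Crossref-registered DOIs such as this one.

        # Fetch a BibTeX record for this article via DOI content negotiation.
        # Assumes the third-party `requests` package is installed (pip install requests).
        import requests

        DOI_URL = "https://doi.org/10.3389/fpsyg.2018.01435"  # DOI from the record above

        # Ask doi.org for BibTeX; Crossref-registered DOIs honour this Accept header.
        response = requests.get(
            DOI_URL,
            headers={"Accept": "application/x-bibtex"},
            timeout=30,
        )
        response.raise_for_status()

        print(response.text)  # prints the BibTeX entry for Tripathy and Ögmen (2018)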


    Related items

    Showing items related by title, author, creator and subject.

    • Bias effects of short- and long-term color memory for unique objects

      Bloj, Marina; Weiß, D.; Gegenfurtner, K.R. (2016-03-09)
      Are objects remembered with a more saturated color? Some of the evidence supporting this statement comes from research using “memory colors”—the typical colors of particular objects, for example, the green of grass. The problematic aspect of these findings is that many different exemplars exist, some of which might exhibit a higher saturation than the one measured by the experimenter. Here we avoid this problem by using unique personal items and comparing long- and short-term color memory matches (in hue, value, and chroma) with those obtained with the object present. Our results, on average, confirm that objects are remembered as more saturated than they are.
    • The reference frame for encoding and retention of motion depends on stimulus set size

      Huynh, D.L.; Tripathy, Srimant P.; Bedell, H.E.; Ogmen, Haluk (2017-04)
      The goal of this study was to investigate the reference frames used in perceptual encoding and storage of visual motion information. In our experiments, observers viewed multiple moving objects and reported the direction of motion of a randomly selected item. Using a vector-decomposition technique, we computed performance during smooth pursuit with respect to a spatiotopic (nonretinotopic) and to a retinotopic component and compared them with performance during fixation, which served as the baseline. For the stimulus encoding stage, which precedes memory, we found that the reference frame depends on the stimulus set size. For a single moving target, the spatiotopic reference frame had the most significant contribution with some additional contribution from the retinotopic reference frame. When the number of items increased (Set Sizes 3 to 7), the spatiotopic reference frame was able to account for the performance. Finally, when the number of items became larger than 7, the distinction between reference frames vanished. We interpret this finding as a switch to a more abstract nonmetric encoding of motion direction. We found that the retinotopic reference frame was not used in memory. Taken together with other studies, our results suggest that, whereas a retinotopic reference frame may be employed for controlling eye movements, perception and memory use primarily nonretinotopic reference frames. Furthermore, the use of nonretinotopic reference frames appears to be capacity limited. In the case of complex stimuli, the visual system may use perceptual grouping in order to simplify the complexity of stimuli or resort to a nonmetric abstract coding of motion information.
    • True and intentionally fabricated memories

      Justice, L.V.; Morrison, Catriona M.; Conway, M.A. (2013)
      The aim of the experiment reported here was to investigate the processes underlying the construction of truthful and deliberately fabricated memories. Properties of memories created to be intentionally false (fabricated memories) were compared to properties of memories believed to be true (true memories). Participants recalled and then wrote or spoke true memories and fabricated memories of everyday events. It was found that true memories were reliably more vivid than fabricated memories and were nearly always recalled from a first-person perspective. In contrast, fabricated memories differed from true memories in that they were judged to be reliably older, were more frequently recalled from a third-person perspective, and linguistic analysis revealed that they required more cognitive effort to generate. No notable differences were found across modality of reporting. Finally, it was found that intentionally fabricated memories were created by recalling and then “editing” true memories. Overall, these findings show that true and fabricated memories systematically differ, despite the fact that both are based on true memories.