A dynamic neural resource model bridges sensory and working memory
Abstract
Probing memory of a complex visual image within a few hundred milliseconds after its disappearance reveals significantly greater fidelity of recall than if the probe is delayed by as little as a second. Classically interpreted, the former taps into a detailed but rapidly decaying visual sensory or "iconic" memory (IM), while the latter relies on capacity-limited but comparatively stable visual working memory (VWM). While iconic decay and VWM capacity have been extensively studied independently, currently no single framework quantitatively accounts for the dynamics of memory fidelity over these timescales. Here we extend a stationary neural population model of VWM with a temporal dimension, incorporating rapid sensory-driven accumulation of activity encoding each visual feature in memory, and a slower accumulation of internal error that causes memorized features to randomly drift over time. Instead of facilitating read-out from an independent sensory store, an early cue benefits recall by lifting the effective limit on VWM signal strength imposed when multiple items compete for representation, allowing memory for the cued item to be supplemented with information from the decaying sensory trace. Empirical measurements of human recall dynamics validate these predictions while excluding alternative model architectures.
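The dynamics described above (sensory-driven accumulation of per-item signal, a shared limit while items compete, and slow random drift of the stored feature values) can be illustrated with a toy simulation. The sketch below is a minimal reading of that description, not the published model: the parameter values, the exponential decay of the sensory trace, the hard normalization step, and the signal-dependent report noise are all assumptions introduced here for illustration.

import numpy as np

rng = np.random.default_rng(1)

# Illustrative parameters (assumed values, not the authors' fitted estimates)
DT = 0.001          # simulation step (s)
TAU_SENSORY = 0.2   # decay constant of the sensory (iconic) trace (s)
RATE_IN = 50.0      # rate of sensory-driven signal accumulation
TOTAL_CAP = 10.0    # limit on summed VWM signal while items compete
SIGMA_DRIFT = 0.3   # diffusion rate of memorized feature values (rad/sqrt(s))

def simulate_trial(n_items=4, cue_time=0.1, report_delay=0.2, cued=0):
    """Return the circular recall error for the cued item on one trial."""
    features = rng.uniform(-np.pi, np.pi, n_items)  # true feature values
    memory = features.copy()                        # stored values, drift over time
    signal = np.zeros(n_items)                      # per-item VWM signal strength
    t = 0.0
    while t < cue_time + report_delay:
        sensory = RATE_IN * np.exp(-t / TAU_SENSORY) * DT  # decaying sensory input
        if t < cue_time:
            # All items compete: accumulate, then normalize to the shared cap.
            signal += sensory
            if signal.sum() > TOTAL_CAP:
                signal *= TOTAL_CAP / signal.sum()
        else:
            # After the cue the competitive limit is lifted for the cued item,
            # which can be topped up from whatever remains of the sensory trace.
            signal[cued] = min(signal[cued] + sensory, TOTAL_CAP)
        # Internal error accumulates as random drift of the stored feature values.
        memory += SIGMA_DRIFT * np.sqrt(DT) * rng.standard_normal(n_items)
        t += DT
    # Report the drifted value with noise inversely related to signal strength.
    report = memory[cued] + rng.normal(0.0, 1.0 / np.sqrt(signal[cued]))
    return np.angle(np.exp(1j * (report - features[cued])))

# Early vs. late cue: an early cue should yield smaller recall errors.
early = [abs(simulate_trial(cue_time=0.05)) for _ in range(500)]
late = [abs(simulate_trial(cue_time=1.0)) for _ in range(500)]
print(f"mean |error|: early cue {np.mean(early):.2f} rad, late cue {np.mean(late):.2f} rad")

Under these assumptions the simulation reproduces the qualitative pattern described in the abstract: recall error grows with cue delay both because the stored feature values drift and because a late cue arrives after the sensory trace has decayed, so the cued item can no longer be supplemented beyond its competition-limited signal.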
Description
Peer reviewed: True
Acknowledgements: We thank George Sperling and Sebastian Schneegans for helpful discussion, Robert Taylor for help with Bayesian hierarchical modeling, and Jessica McMaster for help with data collection. We used resources provided by the Cambridge Service for Data Driven Discovery (CSD3) operated by the University of Cambridge Research Computing Service. This research was supported by the Wellcome Trust (grant 106926 to PMB).
Journal Title
eLife
Journal ISSN
2050-084X