Evidence for a deep, distributed and dynamic code for animacy in human ventral anterior temporal cortex
eLife Sciences Publications, Ltd
Rogers, T. T., Cox, C. R., Lu, Q., Shimotake, A., Kikuchi, T., Kunieda, T., Miyamoto, S., et al. (2021). Evidence for a deep, distributed and dynamic code for animacy in human ventral anterior temporal cortex. eLife, 10. https://doi.org/10.7554/eLife.66276
Funder: European Research Council; FundRef: http://dx.doi.org/10.13039/501100000781; Grant(s): GAP: 502670428 - BRAIN2MIND_NEUROCOMP
How does the human brain encode semantic information about objects? This paper reconciles two seemingly contradictory views. The first proposes that local neural populations independently encode semantic features; the second, that semantic representations arise as a dynamic distributed code that changes radically with stimulus processing. Combining simulations with a well-known neural network model of semantic memory, multivariate pattern classification, and human electrocorticography, we find that both views are partially correct: information about the animacy of a depicted stimulus is distributed across ventral temporal cortex in a dynamic code that possesses feature-like elements posteriorly but whose elements change rapidly and nonlinearly in anterior regions. This pattern is consistent with the view that anterior temporal lobes serve as a deep cross-modal 'hub' in an interactive semantic network, and more generally suggests that tertiary association cortices may adopt dynamic distributed codes that are difficult to detect with common brain imaging methods.
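The abstract's core method is time-resolved multivariate pattern classification: decoding a stimulus property (here, animacy) from multichannel neural recordings within successive time windows, to ask whether the underlying code is stable and feature-like or rapidly changing. The sketch below illustrates that logic only; it is not the authors' pipeline, and the array dimensions, simulated signal, and logistic-regression classifier are all illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Simulated recordings: trials x channels x time samples.
# (Illustrative sizes only; not the dimensions of the study's ECoG data.)
n_trials, n_channels, n_samples = 200, 32, 100
X = rng.standard_normal((n_trials, n_channels, n_samples))
y = rng.integers(0, 2, n_trials)  # 1 = animate, 0 = inanimate

# Inject a weak label-dependent signal into a late time window so the
# decoder has something to find in this toy example.
X[y == 1, :8, 60:80] += 0.5

# Decode animacy separately within each non-overlapping time window.
window = 10
pipe = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
for start in range(0, n_samples - window + 1, window):
    # Features: mean activity per channel within the window.
    feats = X[:, :, start:start + window].mean(axis=2)
    acc = cross_val_score(pipe, feats, y, cv=5).mean()
    print(f"window {start:3d}-{start + window:3d}: accuracy = {acc:.2f}")
```

In the paper's framing, a static feature-like code would decode within a window and also generalize when a classifier trained on one window is tested on another, whereas a dynamic code would decode within windows but generalize poorly across them; extending this sketch to cross-window training and testing (a temporal-generalization analysis) probes exactly that distinction.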
Research Article, Neuroscience, semantic memory, cognition, neural networks, ECoG, temporal lobe, MVPA, human
Medical Research Council (MR/R023883/1)
External DOI: https://doi.org/10.7554/eLife.66276
This record's URL: https://www.repository.cam.ac.uk/handle/1810/329995