Encoding Sequential Information in Semantic Space Models: Comparing Holographic Reduced Representation and Random Permutation
Recchia, Gabriel; Sahlgren, Magnus; Kanerva, Pentti; Jones, Michael N.
Recchia, G., Sahlgren, M., Kanerva, P., & Jones, M. N. (2015). Encoding Sequential Information in Semantic Space Models: Comparing Holographic Reduced Representation and Random Permutation. https://doi.org/10.1155/2015/986574
Circular convolution and random permutation have each been proposed as neurally plausible binding operators capable of encoding sequential information in semantic memory. We perform several controlled comparisons of circular convolution and random permutation as means of encoding paired associates as well as encoding sequential information. Random permutations outperformed convolution with respect to the number of paired associates that can be reliably stored in a single memory trace. Performance was equal on semantic tasks when using a small corpus, but random permutations were ultimately capable of achieving superior performance due to their higher scalability to large corpora. Finally, “noisy” permutations in which units are mapped to other units arbitrarily (no one-to-one mapping) perform nearly as well as true permutations. These findings increase the neurological plausibility of random permutations and highlight their utility in vector space models of semantics.
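The two operators compared in the abstract can be sketched in a few lines of NumPy. Below, paired associates are bound with circular convolution (the HRR operator, computed via FFT) and decoded with circular correlation, while sequential order is encoded by applying a fixed random permutation once per position and decoded by un-permuting. The dimensionality, vocabulary, and cosine-similarity "cleanup" step are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(42)
D = 2048  # vector dimensionality (illustrative choice, not the paper's)

def rand_vec():
    # Gaussian random vector with variance 1/D, so its norm is close to 1
    return rng.normal(0.0, 1.0 / np.sqrt(D), D)

def cconv(a, b):
    # circular convolution: the HRR binding operator, computed via FFT
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def ccorr(a, b):
    # circular correlation: approximate inverse of circular convolution
    return np.real(np.fft.ifft(np.conj(np.fft.fft(a)) * np.fft.fft(b)))

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

vocab = {w: rand_vec() for w in ["dog", "cat", "bone", "fish", "tree"]}

def cleanup(noisy):
    # "cleanup memory": nearest vocabulary item by cosine similarity
    return max(vocab, key=lambda w: cosine(vocab[w], noisy))

# HRR: two paired associates superposed in a single memory trace
trace = cconv(vocab["dog"], vocab["bone"]) + cconv(vocab["cat"], vocab["fish"])
recalled_from_dog = cleanup(ccorr(vocab["dog"], trace))  # noisy "bone"
recalled_from_cat = cleanup(ccorr(vocab["cat"], trace))  # noisy "fish"

# Random permutation: encode the ordered sequence dog, cat, tree by
# permuting each item once per position before superposing
perm = rng.permutation(D)
inv = np.argsort(perm)  # inverse permutation

def permute(v, n):
    # apply the fixed permutation n times (n = 0 leaves v unchanged)
    for _ in range(n):
        v = v[perm]
    return v

seq_trace = (permute(vocab["dog"], 0)
             + permute(vocab["cat"], 1)
             + permute(vocab["tree"], 2))

def item_at(position):
    # un-permute the trace `position` times, then clean up
    v = seq_trace
    for _ in range(position):
        v = v[inv]
    return cleanup(v)
```

With vectors of this dimensionality, the unbound results are noisy but far closer (by cosine) to the stored associate than to any other vocabulary item, so cleanup recovers the correct pairings and sequence positions reliably.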
External DOI: https://doi.org/10.1155/2015/986574
This record's URL: https://www.repository.cam.ac.uk/handle/1810/267637
All Rights Reserved
Rights Holder: Copyright © 2015 Gabriel Recchia et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Licence URL: https://www.rioxx.net/licenses/all-rights-reserved/