Encoding sequential information in semantic space models: comparing holographic reduced representation and random permutation

Authors

Recchia, Gabriel 
Sahlgren, Magnus 
Kanerva, Pentti 
Jones, Michael N 

Abstract

Circular convolution and random permutation have each been proposed as neurally plausible binding operators capable of encoding sequential information in semantic memory. We perform several controlled comparisons of circular convolution and random permutation as means of encoding paired associates as well as encoding sequential information. Random permutations outperformed convolution with respect to the number of paired associates that can be reliably stored in a single memory trace. Performance was equal on semantic tasks when using a small corpus, but random permutations were ultimately capable of achieving superior performance due to their higher scalability to large corpora. Finally, "noisy" permutations in which units are mapped to other units arbitrarily (no one-to-one mapping) perform nearly as well as true permutations. These findings increase the neurological plausibility of random permutations and highlight their utility in vector space models of semantics.
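To make the two binding operators concrete, the following is a minimal sketch (not the authors' actual implementation) of encoding and decoding a single paired associate with circular convolution (Plate-style holographic reduced representation) and with a random permutation. The dimensionality, the FFT-based convolution, and the involution-based approximate inverse are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 1024  # vector dimensionality (illustrative choice)

def cconv(a, b):
    # circular convolution computed via the FFT
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def involution(a):
    # approximate inverse under circular convolution (Plate's HRR)
    return np.concatenate(([a[0]], a[:0:-1]))

def rand_vec():
    # random Gaussian vector with expected unit length
    return rng.normal(0.0, 1.0 / np.sqrt(d), d)

def cos(a, b):
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

apple, red = rand_vec(), rand_vec()

# --- HRR: bind the pair by convolution, decode with the approximate inverse ---
trace = cconv(apple, red)
decoded = cconv(trace, involution(apple))  # noisy reconstruction of `red`

# --- random permutation: mark one item's role by permuting its elements ---
perm = rng.permutation(d)
inv_perm = np.argsort(perm)          # exact inverse permutation
ptrace = apple + red[perm]           # e.g. "red occurs after apple"
pdecoded = ptrace[inv_perm]          # recovers red plus permuted noise
```

In both cases the decoded vector is a noisy copy of the stored associate: retrieval means comparing it against the vocabulary (here just `apple` and `red`) and taking the most similar item by cosine.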

Keywords

Humans, Information Storage and Retrieval, Natural Language Processing, Semantics, Space Simulation, Vocabulary

Journal Title

Computational Intelligence and Neuroscience

Journal ISSN

1687-5265
1687-5273

Publisher

Hindawi Limited