Encoding Sequential Information in Vector Space Models of Semantics: Comparing Holographic Reduced Representation and Random Permutation
Number of Authors: 4
2010 (English). Conference paper (Refereed)
Abstract [en]

Encoding information about the order in which words typically appear has been shown to improve the performance of high-dimensional semantic space models. This requires an encoding operation capable of binding together vectors in an order-sensitive way, and efficient enough to scale to large text corpora. Although both circular convolution and random permutations have been enlisted for this purpose in semantic models, these operations have never been systematically compared. In Experiment 1 we compare their storage capacity and probability of correct retrieval; in Experiments 2 and 3 we compare their performance on semantic tasks when integrated into existing models. We conclude that random permutations are a scalable alternative to circular convolution with several desirable properties.
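
For readers who want the mechanics behind the abstract, the following is a minimal NumPy sketch of the two binding operations being compared. It is illustrative only, not the authors' implementation: the dimensionality, the random seed, and the FFT-based convolution are assumptions made for brevity.

import numpy as np

rng = np.random.default_rng(0)
d = 1024  # dimensionality of the space (assumed value for this sketch)

# Two random high-dimensional vectors standing in for word vectors.
a = rng.normal(0, 1 / np.sqrt(d), d)
b = rng.normal(0, 1 / np.sqrt(d), d)

def cosine(x, y):
    return x @ y / (np.linalg.norm(x) * np.linalg.norm(y))

# Circular convolution, the binding operation of Holographic Reduced
# Representation (HRR), computed in O(d log d) via the FFT.
def cconv(x, y):
    return np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(y)))

def involution(x):
    # Approximate inverse under circular convolution: x*[i] = x[-i mod d].
    return np.concatenate(([x[0]], x[:0:-1]))

trace = cconv(a, b)                  # bind a and b into a single vector
a_hat = cconv(trace, involution(b))  # decode by correlating the trace with b
print(cosine(a_hat, a))              # well above chance: a is recoverable

# Random permutation binding: one fixed reordering of coordinates marks
# relative position, and its inverse permutation undoes the encoding.
perm = rng.permutation(d)
inv = np.argsort(perm)

trace = a[perm] + b                  # e.g., encode "a one step before b"
a_hat = trace[inv]                   # recovers a plus permuted noise from b
print(cosine(a_hat, a))              # roughly 0.7 for independent vectors

Note that circular convolution by itself is commutative, so order sensitivity in HRR-style models comes from how the bound vectors are constructed, whereas a permutation is inherently direction-specific; the sketch only shows that both operations bind and approximately invert.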

Place, publisher, year, edition, pages
2010, 11, p. 865-870
National Category
Computer and Information Science
URN: urn:nbn:se:ri:diva-16051
OAI: diva2:1038075
In: Proceedings of the 32nd Annual Conference of the Cognitive Science Society
Available from: 2016-10-18 Created: 2016-10-18

Open Access in DiVA

fulltext (249 kB)
File name: FULLTEXT01.pdf
File size: 249 kB
Checksum: SHA-512
Type: fulltext
Mimetype: application/pdf

By author/editor
Sahlgren, Magnus
