Analogical mapping and inference with binary spatter codes and sparse distributed memory
Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
La Trobe University.
Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab. ORCID iD: 0000-0001-5662-825X
2013 (English). In: The 2013 International Joint Conference on Neural Networks (IJCNN): Dallas, Texas, 4-9 Aug 2013, Piscataway, NJ: IEEE Communications Society, 2013, p. 1-8. Conference paper, Published paper (Refereed)
Abstract [en]

Analogy-making is a key function of human cognition. Therefore, the development of computational models of analogy that automatically learn from examples can lead to significant advances in cognitive systems. Analogies require complex, relational representations of learned structures, which is challenging for both symbolic and neurally inspired models. Vector symbolic architectures (VSAs) are a class of connectionist models for the representation and manipulation of compositional structures, which can be used to model analogy. We study a novel VSA network for the analogical mapping of compositional structures, which integrates an associative memory known as sparse distributed memory (SDM). The SDM enables non-commutative binding of compositional structures, which makes it possible to predict novel patterns in sequences. To demonstrate this property we apply the network to a commonly used intelligence test called Raven’s Progressive Matrices. We present results of simulation experiments for the Raven’s task and calculate the probability of prediction error at the 95% confidence level. We find that non-commutative binding requires sparse activation of the SDM and that 10–20% concept-specific activation of neurons is optimal. The optimal dimensionality of the binary distributed representations of the VSA is of the order of 10^4, which is comparable with previous results and the average synapse count of neurons in the cerebral cortex.
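
The abstract refers to the standard binary spatter code operations used in such VSAs: binding by element-wise XOR and bundling (superposition) by majority vote. The short Python sketch below illustrates only these generic operations, not the paper's network or its SDM component; the dimensionality, the role/filler names, and the random seed are illustrative assumptions, not parameters taken from the paper.

# Minimal sketch of binary spatter code (BSC) operations, assuming the
# standard formulation: binding by element-wise XOR, bundling by majority
# vote. Illustrative only; not the paper's model or its SDM.
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # dimensionality of the order mentioned in the abstract (~10^4)

def random_hv():
    """Random dense binary hypervector."""
    return rng.integers(0, 2, size=D, dtype=np.uint8)

def bind(a, b):
    """Binding: element-wise XOR; self-inverse, so bind(bind(a, b), b) recovers a."""
    return np.bitwise_xor(a, b)

def bundle(vectors):
    """Bundling: element-wise majority vote, ties broken at random."""
    s = np.sum(np.stack(vectors), axis=0)
    out = (2 * s > len(vectors)).astype(np.uint8)
    ties = (2 * s == len(vectors))
    out[ties] = rng.integers(0, 2, size=int(ties.sum()), dtype=np.uint8)
    return out

def hamming(a, b):
    """Normalized Hamming distance; ~0.5 for unrelated random vectors."""
    return np.count_nonzero(a != b) / D

# Encode a simple role-filler structure and recover a filler by unbinding.
role_shape, role_color = random_hv(), random_hv()
circle, blue = random_hv(), random_hv()

record = bundle([bind(role_shape, circle), bind(role_color, blue)])

# Unbinding a role yields a noisy copy of its filler: clearly closer to the
# stored filler than to an unrelated vector.
probe = bind(record, role_color)
print("distance to blue:  ", hamming(probe, blue))    # ~0.25
print("distance to circle:", hamming(probe, circle))  # ~0.5 (chance level)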

Place, publisher, year, edition, pages
Piscataway, NJ: IEEE Communications Society, 2013. p. 1-8
Series
Proceedings of ... International Joint Conference on Neural Networks, ISSN 2161-4393
National Category
Other Electrical Engineering, Electronic Engineering, Information Engineering
Research subject
Industrial Electronics
Identifiers
URN: urn:nbn:se:ltu:diva-40730
Local ID: ff850dbb-d6f7-426b-844b-2e2a6fe4303c
ISBN: 978-1-4673-6128-6 (print)
OAI: oai:DiVA.org:ltu-40730
DiVA, id: diva2:1014251
Conference
International Joint Conference on Neural Networks, 04/08/2013 - 09/08/2013
Note
Approved; 2013; 20130408 (bleemr). Available from: 2016-10-03. Created: 2016-10-03. Last updated: 2018-05-04. Bibliographically approved.

Open Access in DiVA

fulltext (1003 kB)
File information
File name: FULLTEXT01.pdf
File size: 1003 kB
Checksum: SHA-512
0f76b2ecb8bf7affa9b00929f62ab8a12a3e1c70bad57cb3b3fe299dbff02da78a6939805dd1ea6c73c650c28d7535b975e2bf17b3a27fb2b6a072782d51e52f
Type: fulltext
Mimetype: application/pdf

Search in DiVA

By author/editor
Emruli, Blerim; Sandin, Fredrik
By organisation
Embedded Internet Systems Lab
Other Electrical Engineering, Electronic Engineering, Information Engineering
