Some computational aspects of attractor memory
KTH, School of Computer Science and Communication (CSC), Numerical Analysis and Computer Science, NADA.
2005 (English). Licentiate thesis, comprehensive summary (Other scientific)
Abstract [en]

In this thesis I present novel mechanisms for certain computational capabilities of the cerebral cortex, building on the established notion of attractor memory. A sparse binary coding network for generating efficient representations of sensory input is presented. It is demonstrated that this network model reproduces well the receptive field shapes seen in primary visual cortex, and that its representations are efficient with respect to storage in associative memory. I show how an autoassociative memory, augmented with dynamical synapses, can function as a general sequence learning network. I demonstrate how an abstract attractor memory system may be realized on the microcircuit level, and how it may be analyzed using tools similar to those used experimentally. I present some predictions from the hypothesis that the macroscopic connectivity of the cortex is optimized for attractor memory function. I also discuss methodological aspects of modelling in computational neuroscience.
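As a point of reference for the attractor memory notion the thesis builds on, the sketch below is a minimal Hopfield-style autoassociative network, a standard textbook construction and not any of the thesis's own models. All sizes and parameters are invented for illustration; it shows the core behavior the abstract refers to, namely pattern completion from a corrupted cue.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 10                       # neurons, stored patterns (load well below capacity)

# Random +/-1 patterns stored with the Hebbian outer-product rule
patterns = rng.choice([-1, 1], size=(P, N))
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0.0)             # no self-connections

def recall(state, steps=20):
    """Synchronous sign updates until the state settles into an attractor."""
    for _ in range(steps):
        state = np.sign(W @ state)
        state[state == 0] = 1        # break ties deterministically
    return state

# Cue with a corrupted copy of pattern 0 (20% of bits flipped)
cue = patterns[0].copy()
flip = rng.choice(N, size=N // 5, replace=False)
cue[flip] *= -1

completed = recall(cue)
print(np.mean(completed == patterns[0]))   # fraction of bits recovered, near 1.0
```

At this low memory load the corrupted cue falls inside the basin of attraction of the stored pattern, so the dynamics restore essentially all flipped bits.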

Place, publisher, year, edition, pages
Stockholm: KTH, 2005, p. viii, 76
Series
Trita-NA, ISSN 0348-2952 ; 0509
Keywords [en]
Datalogi, attractor memory, cerebral cortex, neural networks
Keywords [sv]
Datalogi
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:kth:diva-249
ISBN: 91-7283-983-X (print)
OAI: oai:DiVA.org:kth-249
DiVA, id: diva2:8079
Presentation
2005-03-15, Sal E32, KTH, Lindstedtsvägen 3, Stockholm, 07:00
Note
QC 20101220. Available from: 2005-05-31. Created: 2005-05-31. Last updated: 2018-01-11. Bibliographically approved.
List of papers
1. Sequence memory with dynamical synapses
2004 (English). In: Neurocomputing, ISSN 0925-2312, E-ISSN 1872-8286, Vol. 58-60, p. 271-278. Article in journal (Refereed). Published.
Abstract [en]

We present an attractor model of cortical memory capable of sequence learning. The network incorporates a dynamical synapse model and is trained using a Hebbian learning rule that operates by redistribution of synaptic efficacy. Depending on parameter settings, it performs either sequential or unordered recall. The model reproduces data from free recall experiments in humans. Memory capacity scales with network size, storing sequences at about 0.18 bits per synapse.
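The paper's key ingredient is the dynamical synapse. As an illustrative sketch only, with invented parameters and a generic Tsodyks-Markram-style depression variable rather than the paper's actual model, the following shows how sustained presynaptic activity depletes synaptic resources, so that a highly active cell assembly weakens its own recurrent support. This is the kind of self-limiting effect that can push an attractor network onward to the next pattern in a sequence.

```python
# Tsodyks-Markram-style depressing synapse: a resource variable x is consumed
# by presynaptic activity and recovers with time constant tau_rec.
# (Illustrative parameters, not taken from the paper.)
U, tau_rec, dt = 0.5, 0.8, 0.001     # release fraction, recovery (s), time step (s)

def steady_drive(rate):
    """Integrate to steady state; return the mean postsynaptic drive U*x*rate."""
    x = 1.0
    for _ in range(int(5.0 / dt)):
        x += dt * ((1.0 - x) / tau_rec - U * x * rate)
    return U * x * rate

# The drive saturates with rate: a strongly active assembly cannot keep
# increasing its own recurrent excitation.
for rate in (1.0, 5.0, 20.0, 50.0):
    print(f"rate {rate:5.1f} Hz -> steady drive {steady_drive(rate):.3f}")
```

Analytically, the steady-state resource is x = 1 / (1 + U * tau_rec * rate), so the drive approaches the ceiling 1 / tau_rec as the rate grows.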

Keywords
sequence learning, free recall, dynamical synapses, synaptic depression, attractor memory
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:kth:diva-6307
DOI: 10.1016/j.neucom.2004.01.055
000222245900043
Scopus ID: 2-s2.0-2542437082
Note
QC 20100916. 12th Annual Computational Neuroscience Meeting (CNS 03), Alicante, Spain, July 5-9, 2003. Available from: 2006-11-01. Created: 2006-11-01. Last updated: 2018-01-13. Bibliographically approved.
2. A network model for the rapid formation of binary sparse representations of sensory inputs.
(English). Manuscript (preprint) (Other academic)
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:kth:diva-27644
Note
QC 20101217. Available from: 2010-12-17. Created: 2010-12-17. Last updated: 2018-01-12. Bibliographically approved.
3. Early sensory representation in cortex optimizes information content in small neural assemblies.
(English). Manuscript (preprint) (Other academic)
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:kth:diva-27643
Note
QC 20101217. Available from: 2010-12-17. Created: 2010-12-17. Last updated: 2018-01-12. Bibliographically approved.
4. Attractor neural networks with patchy connectivity
2006 (English). In: Neurocomputing, ISSN 0925-2312, E-ISSN 1872-8286, Vol. 69, no 7-9, p. 627-633. Article in journal (Refereed). Published.
Abstract [en]

The neurons in the mammalian visual cortex are arranged in columnar structures, and the synaptic contacts of the pyramidal neurons in layer II/III are clustered into patches that are sparsely distributed over the surrounding cortical surface. Here, we use an attractor neural-network model of the cortical circuitry to investigate the effects of patchy connectivity on both the properties of the network and its attractor dynamics. An analysis of the network shows that the signal-to-noise ratio of the synaptic potential sums is improved by the patchy connectivity, which results in a higher storage capacity. This analysis is performed for both the Hopfield and Willshaw learning rules, and the results are confirmed by simulation experiments.
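The Willshaw rule mentioned in the abstract stores sparse binary patterns by clipping the Hebbian sum to 0/1: a weight is set if its two units were ever coactive. The sketch below is a generic illustration of that rule with invented sizes; it does not include the paper's patchy-connectivity structure.

```python
import numpy as np

rng = np.random.default_rng(1)
N, P, K = 400, 40, 12                # units, patterns, active units per pattern

# Sparse binary patterns; Willshaw "clipped Hebbian" learning: a weight is 1
# if the two units were ever coactive in any stored pattern.
patterns = np.zeros((P, N), dtype=int)
for p in range(P):
    patterns[p, rng.choice(N, size=K, replace=False)] = 1
W = (patterns.T @ patterns > 0).astype(int)
np.fill_diagonal(W, 0)

def recall(cue):
    """One-step recall: keep the K units with the largest dendritic sums."""
    h = W @ cue
    out = np.zeros(N, dtype=int)
    out[np.argsort(h)[-K:]] = 1
    return out

# Cue with half of pattern 0's active units removed
cue = patterns[0].copy()
active = np.flatnonzero(cue)
cue[active[: K // 2]] = 0

out = recall(cue)
overlap = (out & patterns[0]).sum() / K
print(overlap)                       # fraction of pattern 0 recovered
```

Because the weight matrix is still sparse at this load, the surviving cue units drive the stored pattern's units much more strongly than any spurious unit, and the full pattern is recovered in one step.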

Keywords
attractor neural network, patchy connectivity, clustered connections, neocortex, small world network, hypercolumn
National Category
Neurosciences
Identifiers
URN: urn:nbn:se:kth:diva-6311
DOI: 10.1016/j.neucom.2005.12.002
000235797000002
Scopus ID: 2-s2.0-32644438075
Note
QC 20100831. Conference: 13th European Symposium on Artificial Neural Networks (ESANN), Brugge, Belgium, April 2005. Available from: 2006-11-01. Created: 2006-11-01. Last updated: 2018-01-13. Bibliographically approved.
5. Attractor dynamics in a modular network model of neocortex
2006 (English). In: Network: Computation in Neural Systems, ISSN 0954-898X, E-ISSN 1361-6536, Vol. 17, no 3, p. 253-276. Article in journal (Refereed). Published.
Abstract [en]

Starting from the hypothesis that the mammalian neocortex to a first approximation functions as an associative memory of the attractor network type, we formulate a quantitative computational model of neocortical layers 2/3. The model employs biophysically detailed multi-compartmental model neurons with conductance-based synapses and includes pyramidal cells and two types of inhibitory interneurons, i.e., regular spiking non-pyramidal cells and basket cells. The simulated network has a minicolumnar as well as a hypercolumnar modular structure, and we propose that minicolumns rather than single cells are the basic computational units in neocortex. The minicolumns are represented in full scale, and synaptic input to the different types of model neurons is carefully matched to reproduce experimentally measured values and to allow a quantitative reproduction of single-cell recordings. Several key phenomena seen experimentally in vitro and in vivo appear as emergent features of this model. It exhibits robust and fast attractor dynamics with pattern completion and pattern rivalry, and it suggests an explanation for the so-called attentional blink phenomenon. During assembly dynamics, the model faithfully reproduces several features of local UP states, as they have been experimentally observed in vitro, as well as oscillatory behavior similar to that observed in the neocortex.
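The minicolumn/hypercolumn organization described above can be caricatured as a Potts-like attractor network in which exactly one minicolumn per hypercolumn is active and local inhibition acts as a winner-take-all. The sketch below is an abstract illustration of that modular structure with invented sizes; it is emphatically not the biophysically detailed spiking model of the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
H, M, P = 10, 8, 5                   # hypercolumns, minicolumns each, patterns
N = H * M

# Each pattern activates exactly one minicolumn per hypercolumn; Hebbian
# weights count coactivations across hypercolumns.
patterns = np.zeros((P, N))
for p in range(P):
    for h in range(H):
        patterns[p, h * M + rng.integers(M)] = 1
W = patterns.T @ patterns
np.fill_diagonal(W, 0)

def step(state):
    """Winner-take-all within each hypercolumn (local inhibition)."""
    s = W @ state
    out = np.zeros(N)
    for h in range(H):
        out[h * M + np.argmax(s[h * M:(h + 1) * M])] = 1
    return out

# Cue: pattern 0 with half of the hypercolumns set to a random minicolumn
state = patterns[0].copy()
for h in range(H // 2):
    state[h * M:(h + 1) * M] = 0
    state[h * M + rng.integers(M)] = 1

for _ in range(5):
    state = step(state)
correct = (state * patterns[0]).sum() / H
print(correct)                       # fraction of hypercolumns recalled correctly
```

The intact hypercolumns vote for pattern 0's minicolumn in every corrupted hypercolumn, so the local winner-take-all pulls the whole network back onto the stored pattern: an abstract analogue of the pattern completion the detailed model exhibits.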

Keywords
cortex, UP State, attentional blink, attractor dynamics, synchronization
National Category
Neurosciences
Identifiers
URN: urn:nbn:se:kth:diva-6310
DOI: 10.1080/09548980600774619
000244140900003
Scopus ID: 2-s2.0-33845421947
Note
QC 20150729. Available from: 2006-11-01. Created: 2006-11-01. Last updated: 2018-01-13. Bibliographically approved.

Open Access in DiVA

fulltext (6046 kB), 491 downloads
File information
File name: FULLTEXT01.pdf
File size: 6046 kB
Checksum SHA-1: 075f540dd4e63e4892100d9c7442a16b8a12d73a2075d8273fc799b6d6ef74b7c68ea2ab
Type: fulltext
Mimetype: application/pdf

Search in DiVA

By author/editor
Rehn, Martin
By organisation
Numerical Analysis and Computer Science, NADA
Computer Sciences

Total: 491 downloads
The number of downloads is the sum of all downloads of full texts. It may include, e.g., previous versions that are no longer available.
