Synaptic and nonsynaptic plasticity approximating probabilistic inference
KTH, School of Computer Science and Communication (CSC), Computational Biology, CB; Stockholm Brain Institute, Stockholm; University of Edinburgh. ORCID iD: 0000-0001-8796-3237
University of Edinburgh.
KTH, School of Computer Science and Communication (CSC), Computational Biology, CB; Stockholm Brain Institute, Stockholm; Stockholm University. ORCID iD: 0000-0002-2358-7815
2014 (English). In: Frontiers in Synaptic Neuroscience, ISSN 1663-3563, Vol. 6, no. APR, article 8. Article in journal (Refereed). Published.
Abstract [en]

Learning and memory operations in neural circuits are believed to involve molecular cascades of synaptic and nonsynaptic changes that lead to a diverse repertoire of dynamical phenomena at higher levels of processing. Hebbian and homeostatic plasticity, neuromodulation, and intrinsic excitability all conspire to form and maintain memories. But it is still unclear how these seemingly redundant mechanisms could jointly orchestrate learning in a more unified system. To this end, a Hebbian learning rule for spiking neurons inspired by Bayesian statistics is proposed. In this model, synaptic weights and intrinsic currents are adapted on-line upon arrival of single spikes, which initiate a cascade of temporally interacting memory traces that locally estimate probabilities associated with relative neuronal activation levels. Trace dynamics enable synaptic learning to readily demonstrate a spike-timing dependence, stably return to a set-point over long time scales, and remain competitive despite this stability. Beyond unsupervised learning, linking the traces with an external plasticity-modulating signal enables spike-based reinforcement learning. At the postsynaptic neuron, the traces are represented by an activity-dependent ion channel that is shown to regulate the input received by a postsynaptic cell and generate intrinsic graded persistent firing levels. We show how spike-based Hebbian-Bayesian learning can be performed in a simulated inference task using integrate-and-fire (IAF) neurons that are Poisson-firing and background-driven, similar to the preferred regime of cortical neurons. Our results support the view that neurons can represent information in the form of probability distributions, and that probabilistic inference could be a functional by-product of coupled synaptic and nonsynaptic mechanisms operating over several timescales. The model provides a biophysical realization of Bayesian computation by reconciling several observed neural phenomena whose functional effects are only partially understood in concert.
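The cascade of temporally interacting traces described in the abstract can be caricatured in a few lines. The sketch below is a deliberately simplified, assumption-laden reading of a BCPNN-style update (the variable names, time constants, and the `eps` probability floor are illustrative choices, not values taken from the paper): fast Z-traces low-pass filter pre- and postsynaptic spikes, slower P-traces estimate activation and coactivation probabilities, and the synaptic weight and intrinsic bias are read out as log-odds.

```python
import numpy as np

rng = np.random.default_rng(0)

dt = 1.0        # time step (ms)
tau_z = 10.0    # fast trace time constant (ms): sets the spike-timing window
tau_p = 1000.0  # slow trace time constant (ms): long-term probability estimate
eps = 0.01      # lowest attainable probability; keeps the logarithms finite

# Independent Poisson spike trains for one pre- and one postsynaptic neuron
rate = 20e-3                       # 20 Hz expressed in spikes per ms
T = 10000                          # total simulation steps
s_pre = rng.random(T) < rate * dt
s_post = rng.random(T) < rate * dt

z_i = z_j = 0.0                    # fast traces of pre/post activity
p_i = p_j = p_ij = eps             # slow traces estimating P(i), P(j), P(i,j)

for t in range(T):
    # fast Z-traces low-pass filter the incoming spikes
    z_i += dt / tau_z * (s_pre[t] - z_i)
    z_j += dt / tau_z * (s_post[t] - z_j)
    # slow P-traces low-pass filter the fast traces and their coincidence
    p_i += dt / tau_p * (z_i - p_i)
    p_j += dt / tau_p * (z_j - p_j)
    p_ij += dt / tau_p * (z_i * z_j - p_ij)

# Weight as the log-odds of coactivation; bias as the log prior of the
# postsynaptic unit (mapped onto an intrinsic current in the model)
w = np.log((p_ij + eps**2) / ((p_i + eps) * (p_j + eps)))
beta = np.log(p_j + eps)
print(w, beta)
```

Because the two trains here are independent, the estimated joint probability stays close to the product of the marginals and the weight hovers near zero; correlated spiking would drive it positive, anti-correlated spiking negative.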

Place, publisher, year, edition, pages
2014. Vol. 6, no. APR, article 8.
Keyword [en]
Bayes' rule, synaptic plasticity and memory modeling, intrinsic excitability, naïve Bayes classifier, spiking neural networks, Hebbian learning
National Category
Natural Sciences; Medical and Health Sciences
Research subject
Computer Science; Theoretical Chemistry and Biology; Biological Physics
Identifiers
URN: urn:nbn:se:kth:diva-165806
DOI: 10.3389/fnsyn.2014.00008
Scopus ID: 2-s2.0-84904738936
OAI: oai:DiVA.org:kth-165806
DiVA: diva2:808843
Projects
BrainScaleS; Erasmus Mundus EuroSPIN
Funder
VINNOVA
EU, FP7, Seventh Framework Programme, EU-FP7-FET-269921
Swedish National Infrastructure for Computing (SNIC)
Swedish Research Council, VR-621-2009-3807
Note

This document is protected by copyright and was first published by Frontiers. All rights reserved. It is reproduced with permission.

QC 20150430

Available from: 2015-04-29. Created: 2015-04-29. Last updated: 2017-12-04. Bibliographically approved.
In thesis
1. Spike-Based Bayesian-Hebbian Learning in Cortical and Subcortical Microcircuits
2017 (English). Doctoral thesis, comprehensive summary (Other academic).
Abstract [en]

Cortical and subcortical microcircuits are continuously modified throughout life. Despite ongoing changes, these networks stubbornly maintain their functions, which persist even though destabilizing synaptic and nonsynaptic mechanisms should ostensibly propel them towards runaway excitation or quiescence. What dynamical phenomena act together to balance such learning with information processing? What types of activity patterns do they underpin, and how do these patterns relate to our perceptual experiences? What enables learning and memory operations to occur despite such massive and constant neural reorganization? Progress towards answering many of these questions can be pursued through large-scale neuronal simulations.

In this thesis, a Hebbian learning rule for spiking neurons inspired by statistical inference is introduced. The spike-based version of the Bayesian Confidence Propagation Neural Network (BCPNN) learning rule involves changes in both synaptic strengths and intrinsic neuronal currents. The model is motivated by molecular cascades whose functional outcomes are mapped onto biological mechanisms such as Hebbian and homeostatic plasticity, neuromodulation, and intrinsic excitability. Temporally interacting memory traces enable spike-timing dependence, a stable learning regime that remains competitive, postsynaptic activity regulation, spike-based reinforcement learning and intrinsic graded persistent firing levels. 

The thesis seeks to demonstrate how multiple interacting plasticity mechanisms can coordinate reinforcement, auto- and hetero-associative learning within large-scale, spiking, plastic neuronal networks. Spiking neural networks can represent information in the form of probability distributions, and a biophysical realization of Bayesian computation can help reconcile disparate experimental observations.

Place, publisher, year, edition, pages
Stockholm: KTH Royal Institute of Technology, 2017. 89 p.
Series
TRITA-CSC-A, ISSN 1653-5723 ; 2017:11
Keyword
Bayes' rule, synaptic plasticity and memory modeling, intrinsic excitability, naïve Bayes classifier, spiking neural networks, Hebbian learning, neuromorphic engineering, reinforcement learning, temporal sequence learning, attractor network
National Category
Computer Systems
Research subject
Computer Science
Identifiers
URN: urn:nbn:se:kth:diva-205568
ISBN: 978-91-7729-351-4
Public defence
2017-05-09, F3, Lindstedtsvägen 26, Stockholm, 13:00 (English)
Opponent
Supervisors
Note

QC 20170421

Available from: 2017-04-21. Created: 2017-04-19. Last updated: 2017-04-21. Bibliographically approved.

Open Access in DiVA

fulltext (8025 kB), 87 downloads
File information
File name: FULLTEXT01.pdf
File size: 8025 kB
Checksum: SHA-512
bd7454967a49e2d33a56db0d07e34a7053ea9e4701942126a3c7420c797442a1755b69df2cef4ebb4f2722ee911939673e9b1d4e50170586e2bb0ce64b256a0a
Type fulltextMimetype application/pdf

Other links

Publisher's full text: http://journal.frontiersin.org/article/10.3389/fnsyn.2014.00008/abstract
Scopus

Search in DiVA

By author/editor
Tully, Philip; Lansner, Anders
By organisation
Computational Biology, CB
In the same journal
Frontiers in Synaptic Neuroscience
Natural Sciences; Medical and Health Sciences

Total: 87 downloads
The number of downloads is the sum of all downloads of full texts. It may include e.g. previous versions that are no longer available.
