Hamiltonian Monte Carlo with Energy Conserving Subsampling
Author affiliations:
Univ New South Wales, Australia; ARC Ctr Excellence Math and Stat Frontiers ACEMS, Australia.
ARC Ctr Excellence Math and Stat Frontiers ACEMS, Australia; Univ Technol Sydney, Australia.
Univ New South Wales, Australia; ARC Ctr Excellence Math and Stat Frontiers ACEMS, Australia.
ARC Ctr Excellence Math and Stat Frontiers ACEMS, Australia; Univ Sydney, Australia.
2019 (English) In: Journal of Machine Learning Research, ISSN 1532-4435, E-ISSN 1533-7928, Vol. 20, article id 1. Article in journal (Refereed) Published
Abstract [en]

Hamiltonian Monte Carlo (HMC) samples efficiently from high-dimensional posterior distributions with proposed parameter draws obtained by iterating on a discretized version of the Hamiltonian dynamics. The iterations make HMC computationally costly, especially in problems with large data sets, since it is necessary to compute posterior densities and their derivatives with respect to the parameters. Naively computing the Hamiltonian dynamics on a subset of the data causes HMC to lose its key ability to generate distant parameter proposals with high acceptance probability. The key insight in our article is that efficient subsampling HMC for the parameters is possible if both the dynamics and the acceptance probability are computed from the same data subsample in each complete HMC iteration. We show that this is possible to do in a principled way in an HMC-within-Gibbs framework where the subsample is updated using a pseudo-marginal MH step and the parameters are then updated using an HMC step, based on the current subsample. We show that our subsampling methods are fast and compare favorably to two popular sampling algorithms that use gradient estimates from data subsampling. We also explore the current limitations of subsampling HMC algorithms by varying the quality of the variance-reducing control variates used in the estimators of the posterior density and its gradients.
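The abstract describes the sampler at a structural level. To make that structure concrete, below is a minimal Python sketch of the HMC-within-Gibbs scheme on an assumed toy model (scalar logistic regression with a flat prior). It is an illustration of the idea, not the authors' reference implementation: the model, the subsample size m, the step size and trajectory length, and the choice of reference point theta_star for the control variates are all illustrative assumptions, and a chain of this form targets a slightly perturbed posterior, a point the paper analyzes.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model (an assumption for this sketch): logistic regression with one
# coefficient theta and scalar covariates x_i, flat prior on theta.
n, m = 20_000, 200                      # full data size, subsample size
x = rng.normal(size=n)
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-1.0 * x)))

def log_terms(theta, xs, ys):
    z = xs * theta                      # per-observation log-likelihood terms
    return ys * z - np.logaddexp(0.0, z)

def grad_terms(theta, xs, ys):
    z = xs * theta                      # per-observation gradient terms
    return xs * (ys - 1.0 / (1.0 + np.exp(-z)))

# Control variates: first-order Taylor expansion of every term around a
# fixed reference point theta_star (ideally a posterior mode from a
# preliminary run; here we simply reuse the data-generating value).
theta_star = 1.0
l_star = log_terms(theta_star, x, y)
g_star = grad_terms(theta_star, x, y)
sum_l_star, sum_g_star = l_star.sum(), g_star.sum()

def loglik_hat(theta, u):
    """Difference estimator of the full-data log-likelihood from subsample u."""
    d = theta - theta_star
    resid = log_terms(theta, x[u], y[u]) - (l_star[u] + g_star[u] * d)
    return sum_l_star + sum_g_star * d + (n / m) * resid.sum()

def grad_hat(theta, u):
    """Matching subsample estimator of the log-likelihood gradient."""
    return sum_g_star + (n / m) * (grad_terms(theta, x[u], y[u]) - g_star[u]).sum()

def leapfrog(theta, p, u, eps, n_steps):
    """Leapfrog dynamics driven by the SAME subsample u from start to finish."""
    p = p + 0.5 * eps * grad_hat(theta, u)
    for step in range(n_steps):
        theta = theta + eps * p
        if step < n_steps - 1:
            p = p + eps * grad_hat(theta, u)
    p = p + 0.5 * eps * grad_hat(theta, u)
    return theta, p

theta, u = theta_star, rng.integers(0, n, size=m)
eps, n_steps, draws = 0.01, 10, []
for _ in range(2_000):
    # Gibbs step 1: pseudo-marginal MH refresh of the subsample u given theta
    # (independent uniform proposal, so the ratio is of estimated likelihoods).
    u_prop = rng.integers(0, n, size=m)
    if np.log(rng.uniform()) < loglik_hat(theta, u_prop) - loglik_hat(theta, u):
        u = u_prop
    # Gibbs step 2: HMC update of theta in which the dynamics AND the
    # accept/reject ratio are computed from the same u, so the estimated
    # Hamiltonian is (nearly) conserved and acceptance stays high.
    p0 = rng.normal()
    theta_prop, p_prop = leapfrog(theta, p0, u, eps, n_steps)
    log_ratio = (loglik_hat(theta_prop, u) - 0.5 * p_prop**2) \
              - (loglik_hat(theta, u) - 0.5 * p0**2)
    if np.log(rng.uniform()) < log_ratio:
        theta = theta_prop
    draws.append(theta)

print("posterior mean estimate:", np.mean(draws[500:]))
```

The load-bearing design choice is that loglik_hat and grad_hat use the same index vector u throughout one leapfrog trajectory and its accept/reject test; u changes only in the separate pseudo-marginal step. That is what keeps the estimated Hamiltonian conserved along the trajectory and the acceptance rate high, even though each iteration touches only m of the n observations.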

Place, publisher, year, edition, pages
MIT Press, 2019. Vol. 20, article id 1
Keywords [en]
Bayesian inference; Big Data; Markov chain Monte Carlo; Estimated likelihood; Stochastic gradient Hamiltonian Monte Carlo; Stochastic Gradient Langevin Dynamics
Identifiers
URN: urn:nbn:se:liu:diva-159295
ISI: 000476621700001
OAI: oai:DiVA.org:liu-159295
DiVA, id: diva2:1340870
Note

Funding agencies: Australian Research Council Center of Excellence grant [CE140100049]; Swedish Foundation for Strategic Research [RIT 15-0097]

Available from: 2019-08-06 Created: 2019-08-06 Last updated: 2019-11-14 Bibliographically approved

Open Access in DiVA

fulltext (1134 kB), 5 downloads
File information
File: FULLTEXT02.pdf, File size: 1134 kB, Checksum: SHA-512
7e9a34b117c4e1f4641cd5db7a7b7423d761687e8b8be20c26554a23e02469841d98b7af8a388bf4d82d61c90a7595fed916c959f1c9b6b345df61ff9686aaa7
Type: fulltext, Mimetype: application/pdf

Search in DiVA

By author/editor: Villani, Mattias
