Bayesian structure learning in graphical models
Rios, Felix Leopoldo (KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics). ORCID iD: 0000-0002-6886-5436
2016 (English) Licentiate thesis, comprehensive summary (Other academic)
Abstract [en]

This thesis consists of two papers studying structure learning in probabilistic graphical models for both undirected graphs and directed acyclic graphs (DAGs).

Paper A presents a novel family of graph-theoretical algorithms, called junction tree expanders, that incrementally construct junction trees for decomposable graphs. Due to their Markovian property, the junction tree expanders are shown to be suitable as proposal kernels in a sequential Monte Carlo (SMC) sampling scheme for approximating a graph posterior distribution. A simulation study is performed for the case of Gaussian decomposable graphical models, showing the efficiency of the suggested unified approach for both structural and parametric Bayesian inference.

Paper B develops a novel prior distribution over DAGs with the ability to express prior knowledge in terms of graph layerings. In conjunction with the prior, a search-and-score algorithm based on the layering property of DAGs is developed for performing structure learning in Bayesian networks. A simulation study shows that the search-and-score algorithm, together with the prior, has superior performance for learning graphs with a clearly layered structure compared with other priors.
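
Both papers target the same kind of object: a posterior distribution over graph structures. As a generic sketch (the notation is not taken from the thesis itself), for observed data y and a graph G the posterior factorises as

P(G | y) ∝ P(y | G) P(G),

where P(y | G) is the marginal likelihood of the data under the structure and P(G) is a structural prior. Paper A approximates such a posterior over decomposable graphs by SMC, while Paper B concentrates on designing the prior P(G) for DAGs with a layered structure.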

Place, publisher, year, edition, pages
Stockholm: KTH Royal Institute of Technology, 2016. pp. viii, 19
Series
TRITA-MAT-A ; 2015:16
Keywords [en]
Bayesian statistics, graphical models, Bayesian networks, Markov networks, structure learning
National subject category
Probability Theory and Statistics
Research subject
Applied and Computational Mathematics
Identifiers
URN: urn:nbn:se:kth:diva-179852
ISBN: 978-91-7595-832-3 (print)
OAI: oai:DiVA.org:kth-179852
DiVA, id: diva2:892063
Presentation
2016-01-28, Room 3418, Department of Mathematics, Lindstedtsvägen 25, KTH Royal Institute of Technology, Stockholm, 14:00 (English)
Opponent
Supervisors
Note

QC 20160111

Available from: 2016-01-11 Created: 2016-01-04 Last updated: 2022-06-23 Bibliographically approved
List of papers
1. Bayesian structure learning in graphical models using sequential Monte Carlo
(English) Manuscript (preprint) (Other academic)
Abstract [en]

In this paper we present a family of algorithms, the junction tree expanders, for expanding junction trees in the sense that the number of nodes in the underlying decomposable graph is increased by one. The family of junction tree expanders is equipped with a number of theoretical results, including a characterization stating that every junction tree, and consequently every decomposable graph, can be constructed by iteratively applying a junction tree expander. Further, an important feature of a stochastic implementation of a junction tree expander is the Markovian property inherent to the tree propagation dynamics. Using this property, a sequential Monte Carlo algorithm for approximating a probability distribution defined on the space of decomposable graphs is developed, with the junction tree expander as a proposal kernel. Specifically, we apply the sequential Monte Carlo algorithm to structure learning in decomposable Gaussian graphical models, where the target distribution is a junction tree posterior distribution. In this setting, posterior parametric inference on the underlying decomposable graph is a direct by-product of the suggested methodology; working with the G-Wishart family of conjugate priors, we derive a closed-form expression for the Bayesian estimator of the precision matrix of Gaussian graphical models Markov with respect to a decomposable graph. The performance accuracy of the graph and parameter estimators is illustrated through a collection of numerical examples demonstrating the feasibility of the suggested approach in high-dimensional domains.
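
As a rough illustration of how a sequential Monte Carlo sampler built around an incremental graph proposal can be organised, the following Python sketch shows a generic SMC loop in which each particle is a growing junction tree that gains one node per iteration. It is a simplified, hypothetical skeleton: expand_junction_tree (standing in for a junction tree expander), log_target (the unnormalised junction tree posterior), log_proposal and initial_tree are placeholder callables supplied by the user and are not the implementations from the paper.

import numpy as np

def smc_graph_sampler(n_nodes, n_particles, initial_tree,
                      expand_junction_tree, log_target, log_proposal):
    # Start every particle from a trivial single-node junction tree.
    particles = [initial_tree() for _ in range(n_particles)]
    log_w = np.zeros(n_particles)

    for m in range(1, n_nodes):
        for i in range(n_particles):
            # Proposal step: add node m to the current junction tree.
            new_tree = expand_junction_tree(particles[i], node=m)
            # Incremental importance weight: target ratio divided by
            # the proposal density of the move (all in log scale).
            log_w[i] += (log_target(new_tree) - log_target(particles[i])
                         - log_proposal(new_tree, particles[i]))
            particles[i] = new_tree

        # Normalise the weights and resample when the effective
        # sample size drops below half the number of particles.
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        if 1.0 / np.sum(w ** 2) < n_particles / 2:
            idx = np.random.choice(n_particles, size=n_particles, p=w)
            particles = [particles[i] for i in idx]
            log_w[:] = 0.0

    return particles, log_w

The weighted particles returned at the end form an empirical approximation of the graph posterior; posterior quantities such as edge inclusion probabilities can then be estimated as weighted averages over the particles.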

Keywords
Structure learning, Bayesian statistics, Gaussian graphical models
National subject category
Probability Theory and Statistics
Research subject
Mathematics
Identifikatorer
urn:nbn:se:kth:diva-180326 (URN)
Note

QC 20160524

Available from: 2016-01-11 Created: 2016-01-11 Last updated: 2022-06-23 Bibliographically approved
2. The Minimal Hoppe-Beta Prior Distribution for Directed Acyclic Graphs and Structure Learning
(English) Manuscript (preprint) (Other academic)
Abstract [en]

The main contribution of this article is a new prior distribution over directed acyclic graphs intended for structured Bayesian networks, where the structure is given by an ordered block model. That is, the nodes of the graph are objects which fall into categories or blocks, and the blocks have a natural ordering or ranking. The presence of a relationship between two objects is denoted by a directed edge from the object in the lower-ranked category to the object in the higher-ranked category. The models considered here were introduced in Kemp et al. [7] for relational data and extended to multivariate data in Mansinghka et al. [12].

We consider the situation where the nodes of the graph represent random variables whose joint probability distribution factorises along the DAG. We use a minimal layering of the DAG to express the prior. We describe Monte Carlo schemes, based on a generative procedure similar to the one used for the prior, for finding the optimal a posteriori structure given a data matrix, and compare the performance with Mansinghka et al. and with the uniform prior.
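
To make the layering idea concrete, the following Python sketch shows a toy structural prior that penalises edges violating a given ordered blocking of the nodes, together with a naive greedy search-and-score loop. Both the penalty form and the search strategy are illustrative assumptions; they are not the Minimal Hoppe-Beta prior or the search algorithm developed in the paper, and log_marginal_likelihood is a placeholder supplied by the user.

import itertools

def log_layered_prior(dag_edges, layer_of, penalty=2.0):
    # Toy prior: every edge u -> v that does not point from a
    # lower-ranked layer to a strictly higher-ranked one is penalised.
    violations = sum(1 for u, v in dag_edges if layer_of[u] >= layer_of[v])
    return -penalty * violations

def greedy_search(nodes, layer_of, log_marginal_likelihood, n_sweeps=5):
    # Naive search-and-score over single-edge toggles, scoring each
    # candidate by marginal likelihood plus the layered prior
    # (acyclicity checks are omitted in this sketch).
    def score(edges):
        return log_marginal_likelihood(edges) + log_layered_prior(edges, layer_of)

    edges = set()
    best = score(edges)
    for _ in range(n_sweeps):
        for u, v in itertools.permutations(nodes, 2):
            candidate = set(edges)
            candidate.symmetric_difference_update({(u, v)})
            s = score(candidate)
            if s > best:
                edges, best = candidate, s
    return edges, best

A DAG whose edges all respect the layering receives no penalty, so when the data support a clearly layered structure such a prior concentrates the search on graphs that agree with the known ordering.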

Keywords
Graphical models, Bayesian networks, structure learning, DAG prior
National subject category
Probability Theory and Statistics
Research subject
Applied and Computational Mathematics
Identifikatorer
urn:nbn:se:kth:diva-180327 (URN)
Note

QC 20160524

Available from: 2016-01-11 Created: 2016-01-11 Last updated: 2022-06-23 Bibliographically approved

Open Access in DiVA

Thesis (FULLTEXT02.pdf, 643 kB, application/pdf)
