Bayesian Modeling of Directional Data with Acoustic and Other Applications
KTH, School of Electrical Engineering (EES), Communication Theory.
2014 (English) Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

A direction is defined here as a multi-dimensional unit vector. Such unit vectors form directional data. Closely related to directional data are axial data, for which each direction is equivalent to the opposite direction. Directional data and axial data arise in various fields of science. In probabilistic modeling of such data, probability distributions are needed which account for the structure of the space from which data samples are collected. Such distributions are known as directional distributions and axial distributions. This thesis studies the von Mises-Fisher (vMF) distribution and the (complex) Watson distribution as representatives of directional and axial distributions.

Probabilistic models of the data are defined through a set of parameters. In the Bayesian view of uncertainty, these parameters are regarded as random variables in the learning inference. The primary goal of this thesis is to develop Bayesian inference for directional and axial models, more precisely, vMF and (complex) Watson distributions, and parametric mixture models of such distributions. The Bayesian inference is realized using a family of optimization methods known as variational inference. With the proposed variational methods, the intractable Bayesian inference problem is cast as an optimization problem.

The variational inference for vMF and Watson models shall open up new applications and advance existing application domains by reducing restrictive assumptions made by current modeling techniques. This is the central theme of the thesis in all studied applications. Unsupervised clustering of gene-expression and gene-microarray data is an existing application domain, which has been further advanced in this thesis. This thesis also advances application of the complex Watson models in the problem of blind source separation (BSS) with acoustic applications.
Specifically, it is shown that the restrictive assumption of prior knowledge on the true number of sources can be relaxed by the desirable pruning property in Bayesian learning, resulting in BSS methods which can estimate the number of sources.

Furthermore, this thesis introduces a fully Bayesian recursive framework for the BSS task. This is an attempt toward realization of an online BSS method. In order to reduce the well-known problem of permutation ambiguity in the frequency domain, the complete BSS problem is solved in one unified modeling step, combining the frequency bin-wise source estimation with the permutation problem. To realize this, all time frames and frequency bins are connected using a first-order Markov chain. The model can capture dependencies across both time frames and frequency bins simultaneously, using a feed-forward two-dimensional hidden Markov model (2-D HMM).
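For readers unfamiliar with the distributions named above: the von Mises-Fisher density on the unit hypersphere in R^p is f(x; mu, kappa) = C_p(kappa) exp(kappa mu^T x), where C_p(kappa) = kappa^(p/2-1) / ((2*pi)^(p/2) I_{p/2-1}(kappa)) and I_nu is the modified Bessel function of the first kind. The following minimal sketch (not code from the thesis; the function names and the series truncation length are illustrative choices) evaluates this log-density using only the standard library:

```python
import math

def bessel_iv(nu, x, terms=50):
    # Modified Bessel function of the first kind, I_nu(x), via its
    # power series: sum_m (x/2)^(2m+nu) / (m! * Gamma(m+nu+1)).
    return sum((x / 2.0) ** (2 * m + nu) / (math.factorial(m) * math.gamma(m + nu + 1))
               for m in range(terms))

def vmf_log_density(x, mu, kappa):
    # log of the vMF density C_p(kappa) * exp(kappa * mu . x) on the
    # unit hypersphere in R^p, with
    # C_p(kappa) = kappa^(p/2-1) / ((2*pi)^(p/2) * I_{p/2-1}(kappa)).
    p = len(x)
    log_c = ((p / 2.0 - 1.0) * math.log(kappa)
             - (p / 2.0) * math.log(2.0 * math.pi)
             - math.log(bessel_iv(p / 2.0 - 1.0, kappa)))
    return log_c + kappa * sum(a * b for a, b in zip(mu, x))
```

For p = 3 the normalizer reduces to the known closed form kappa / (4*pi*sinh(kappa)), which gives a quick sanity check on the implementation.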

Place, publisher, year, edition, pages
Stockholm: KTH Royal Institute of Technology, 2014, xviii, 63 p.
Series
TRITA-EE, ISSN 1653-5146 ; 2014:043
Keyword [en]
Directional statistics, directional distributions, axial distributions, von Mises-Fisher distribution, complex Watson distribution, complex Bingham distribution, Bayesian inference, probabilistic modeling, variational inference, two-dimensional hidden Markov models, Markov chain, blind source separation, frequency domain BSS, underdetermined BSS, online BSS, Bayesian recursive, gene expression data, gene-microarray data, line spectral frequency, speaker identification.
National Category
Engineering and Technology; Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
URN: urn:nbn:se:kth:diva-153786
ISBN: 978-91-7595-281-9 (print)
OAI: oai:DiVA.org:kth-153786
DiVA: diva2:753712
Public defence
2014-10-31, Q2, Osquldas väg 10 (02 tr), KTH, Stockholm, 09:00 (English)
Supervisors
Note

QC 20141009

Available from: 2014-10-09. Created: 2014-10-08. Last updated: 2014-10-09. Bibliographically approved
List of papers
1. Bayesian Estimation of the von-Mises Fisher Mixture Model with Variational Inference
2014 (English) In: IEEE Transactions on Pattern Analysis and Machine Intelligence, ISSN 0162-8828, E-ISSN 1939-3539, Vol. 36, no. 9, pp. 1701-1715. Article in journal (Refereed) Published
Abstract [en]

This paper addresses the Bayesian estimation of the von-Mises Fisher (vMF) mixture model with variational inference (VI). The learning task in VI consists of optimization of the variational posterior distribution. However, the exact solution by VI does not lead to an analytically tractable solution due to the evaluation of intractable moments involving functional forms of the Bessel function in their arguments. To derive a closed-form solution, we further lower bound the evidence lower bound where the bound is tight at one point in the parameter distribution. While having the value of the bound guaranteed to increase during maximization, we derive an analytically tractable approximation to the posterior distribution which has the same functional form as the assigned prior distribution. The proposed algorithm requires no iterative numerical calculation in the re-estimation procedure, and it can potentially determine the model complexity and avoid the over-fitting problem associated with conventional approaches based on the expectation maximization. Moreover, we derive an analytically tractable approximation to the predictive density of the Bayesian mixture model of vMF distributions. The performance of the proposed approach is verified by experiments with both synthetic and real data.
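The moments that make the exact VI solution intractable involve ratios of Bessel functions in the concentration parameter kappa. As background (this is not the paper's variational update; it is the standard maximum-likelihood approximation of Banerjee et al., 2005, and the function name is an illustrative choice), a common point estimate of kappa from data on the unit hypersphere is:

```python
import math

def vmf_kappa_approx(unit_vectors):
    # Approximate maximum-likelihood concentration for vMF data
    # (Banerjee et al., 2005): kappa ~= rbar * (p - rbar^2) / (1 - rbar^2),
    # where rbar = ||sum_i x_i|| / n is the mean resultant length.
    n = len(unit_vectors)
    p = len(unit_vectors[0])
    resultant = [sum(x[d] for x in unit_vectors) for d in range(p)]
    rbar = math.sqrt(sum(c * c for c in resultant)) / n
    return rbar * (p - rbar ** 2) / (1.0 - rbar ** 2)
```

Tightly clustered directions give rbar close to 1 and hence a large kappa; the paper's Bayesian treatment instead keeps a full posterior distribution over kappa rather than a single such estimate.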

Keyword
Bayesian estimation, von-Mises Fisher distribution, mixture model, variational inference, directional distribution, predictive density, gene expressions, speaker identification
National Category
Computer Science
Identifiers
urn:nbn:se:kth:diva-150510 (URN)
10.1109/TPAMI.2014.2306426 (DOI)
000340210100001 ()
2-s2.0-84905593212 (Scopus ID)
Note

QC 20140916

Available from: 2014-09-16. Created: 2014-09-05. Last updated: 2017-12-05. Bibliographically approved
2. Variational Inference for Watson Mixture Model
(English) In: IEEE Transactions on Pattern Analysis and Machine Intelligence, ISSN 0162-8828, E-ISSN 1939-3539. Article in journal (Other academic) Submitted
Abstract [en]

This paper addresses modelling data using the multivariate Watson distributions. The Watson distribution is one of the simplest distributions for analyzing axially symmetric data. This distribution has gained some attention in recent years due to its modeling capability. However, its Bayesian inference is fairly understudied due to difficulty in handling the normalization factor. Recent development of Markov chain Monte Carlo (MCMC) sampling methods can be applied for this purpose. However, these methods can be prohibitively slow for practical applications. A deterministic alternative is provided by variational methods that convert inference problems into optimization problems. In this paper, we present a variational inference for the Watson mixture model. First, the variational framework is used to side-step the intractability arising from the coupling of latent states and parameters. Second, the variational free energy is further lower bounded in order to avoid intractable moment computation. The proposed approach provides a lower bound on the log marginal likelihood and retains distributional information over all parameters. Moreover, we show that it can regulate its own complexity by pruning unnecessary mixture components while avoiding over-fitting. We discuss potential applications of the modeling with Watson distributions in the problem of blind source separation, and clustering gene expression data sets.
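The axial symmetry mentioned above means the Watson density assigns equal probability to x and -x, because the density kernel depends on the direction only through (mu . x)^2. A minimal sketch of that kernel (names are illustrative; the normalizer, which involves Kummer's confluent hypergeometric function and is the source of the intractability discussed in the paper, is omitted):

```python
def watson_log_kernel(x, mu, kappa):
    # Unnormalized log-density of the Watson distribution,
    # W(x; mu, kappa) proportional to exp(kappa * (mu . x)^2).
    # The omitted normalizer involves Kummer's function M(1/2, p/2, kappa).
    dot = sum(a * b for a, b in zip(mu, x))
    return kappa * dot * dot

# Axial symmetry: x and -x always receive the same value.
x = [0.6, 0.8, 0.0]
mu = [0.0, 1.0, 0.0]
assert watson_log_kernel(x, mu, 5.0) == watson_log_kernel([-v for v in x], mu, 5.0)
```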

Keyword
Bayesian inference, variational inference, Watson distribution, mixture model, axially symmetric, clustering on the unit hypersphere, blind source separation, gene expression
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
urn:nbn:se:kth:diva-153780 (URN)
Note

QS 2014

Available from: 2014-10-08. Created: 2014-10-08. Last updated: 2017-12-05. Bibliographically approved
3. Separation of Unknown Number of Sources
2014 (English) In: IEEE Signal Processing Letters, ISSN 1070-9908, E-ISSN 1558-2361, Vol. 21, no. 5, pp. 625-629. Article in journal (Refereed) Published
Abstract [en]

We address the problem of blind source separation in acoustic applications where there is no prior knowledge about the number of mixing sources. The presented method employs a mixture of complex Watson distributions in its generative model, with a sparse Dirichlet distribution over the mixture weights. The problem is formulated in a fully Bayesian framework, assuming prior distributions over all model parameters. The presented model can regulate its own complexity by pruning unnecessary components, by which we can relax the assumption of prior knowledge on the number of sources.
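The pruning mechanism via a sparse Dirichlet prior can be illustrated in a few lines. Under a symmetric Dirichlet(alpha0) prior over K mixture weights, the posterior-mean weight of component k is (alpha0 + N_k) / (K*alpha0 + N), where N_k is the responsibility mass assigned to that component; with alpha0 << 1, components that attract no data collapse toward zero weight and can be dropped. A hypothetical sketch (function names and the threshold are illustrative, not the paper's exact criterion):

```python
def expected_weights(counts, alpha0):
    # Posterior-mean mixture weights under a symmetric Dirichlet(alpha0)
    # prior: E[w_k] = (alpha0 + N_k) / (K * alpha0 + N).
    total = sum(counts)
    k = len(counts)
    return [(alpha0 + n) / (k * alpha0 + total) for n in counts]

def prune(counts, alpha0, threshold=1e-3):
    # Keep only components whose expected weight exceeds the threshold;
    # with a sparse prior (alpha0 << 1), empty components fall below it.
    w = expected_weights(counts, alpha0)
    return [i for i, wi in enumerate(w) if wi > threshold]
```

For example, with five candidate components but only three receiving responsibility mass, `prune` retains exactly those three, which is how the number of sources can be estimated rather than fixed in advance.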

Keyword
Bayesian inference, Blind source separation, Complex Watson distribution, Variational inference
National Category
Computer Science
Identifiers
urn:nbn:se:kth:diva-153744 (URN)
10.1109/LSP.2014.2309607 (DOI)
000347922600001 ()
2-s2.0-84897498048 (Scopus ID)
Note

QC 20141007

Available from: 2014-10-08. Created: 2014-10-08. Last updated: 2017-12-05. Bibliographically approved
4. Bayesian Recursive Blind Source Separation
(English) In: Journal of Machine Learning Research, ISSN 1532-4435, E-ISSN 1533-7928. Article in journal (Other academic) Submitted
Abstract [en]

We consider the problem of blind source separation (BSS) of convolutive mixtures in underdetermined scenarios, where there are more sources to estimate than recorded signals. This problem has been intensively studied in the literature. Many successful methods rely on batch processing of previously recorded signals, and hence are only best suited for noncausal systems. This paper addresses the problem of online BSS. To realize this, we develop a Bayesian recursive framework. The proposed Bayesian framework allows incorporating prior knowledge in a coherent way, and the recursive learning allows combining information gained from the current observation with all information from the previous observations. Experiments using live audio recordings show promising results.
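The recursive idea described above is that the posterior after one frame becomes the prior for the next, so information accumulates without reprocessing old frames. For a conjugate pair such as Dirichlet weights with per-frame responsibilities, one recursion step is just an addition of pseudo-counts. This is a generic sketch of recursive Bayesian updating, not the paper's actual update equations (which involve the full 2-D HMM model); names are illustrative:

```python
def recursive_dirichlet_update(alpha, responsibilities):
    # One recursive Bayesian step: the current posterior Dirichlet
    # parameters serve as the prior for the next frame, and the new
    # frame's responsibility mass is added to them.
    return [a + r for a, r in zip(alpha, responsibilities)]

# A stream of frames: evidence accumulates across observations
# without ever revisiting earlier frames (online operation).
alpha = [1.0, 1.0]  # prior pseudo-counts for two components
for frame_resp in [[0.9, 0.1], [0.8, 0.2], [0.95, 0.05]]:
    alpha = recursive_dirichlet_update(alpha, frame_resp)
```

After three frames the pseudo-counts reflect all observations seen so far, which is the sense in which the recursion "combines information gained from the current observation with all information from the previous observations."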

Place, publisher, year, edition, pages
MIT Press
Keyword
Blind Source Separation, Two-dimensional Hidden Markov Models, Bayesian Learning, Variational Inference, Watson Distribution
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
urn:nbn:se:kth:diva-153783 (URN)
Note

QS 2014

Available from: 2014-10-08. Created: 2014-10-08. Last updated: 2017-12-05. Bibliographically approved

Open Access in DiVA

Thesis (1393 kB), 586 downloads
File information
File name: FULLTEXT01.pdf
File size: 1393 kB
Checksum (SHA-512): ca0f56967e57a5555f8362ad4248a61261dd1d90d90cbf51a0c6974cca7c1946d924ff5eedc325a41fd5218f53c5948697658806591897c6e48cc800a05e8d44
Type: fulltext
Mimetype: application/pdf

Search in DiVA

By author/editor
Taghia, Jalil
By organisation
Communication Theory
Engineering and Technology; Electrical Engineering, Electronic Engineering, Information Engineering

