
# Bayesian learning of Gaussian mixtures: Variational "over-pruning" revisited
2013 (English) Report (Other academic)
##### Abstract [en]

This study reconsiders two simple toy data examples proposed by MacKay (2001) to illustrate what he called "symmetry-breaking" and inappropriate "over-pruning" by the variational inference (VI) approximation in Bayesian learning of probabilistic mixture models.

The exact Bayesian solution is derived formally, including the effects of the parameter values in the prior distribution of mixture weights. The exact solution is then compared to the results of the VI approximation.

In both toy examples, both the exact solution and the VI approximation normally assigned each data cluster entirely to its own mixture component. For both methods the number of active mixture components is normally the same as the number of data clusters, so in this sense the VI approach causes no "over-pruning". In one extreme example, with two clusters containing only 1 and 3 samples and very small parameter values in the prior Dirichlet distribution of mixture weights, the exact Bayesian solution assigned all samples to the same component, i.e., with "over-pruning", whereas the VI approximation still converged to a solution using both mixture components, i.e., with no "over-pruning". Thus, if inappropriate over-pruning occurs, it is probably caused by inappropriate selection of prior model parameters, and not by the VI approach.

The VI approximation shows "symmetry-breaking" because it converges to one of the arbitrary and equivalent permutations of the indices of mixture components. The "symmetric" exact solution formally includes all these permutations, but this is precisely what makes the exact Bayesian solution computationally impractical. Thus, in these toy examples, we must conclude that "symmetry-breaking" is not the same thing as "over-pruning": the VI approximation shows "symmetry-breaking" but no "over-pruning".

##### Place, publisher, year, edition, pages

Stockholm: KTH Royal Institute of Technology, 2013, 29 p.

##### Series

Trita-EE, ISSN 1653-5146 ; 2013:032

##### Keyword [en]

Machine learning; Bayesian; Variational

##### National Category

Telecommunications

##### Identifiers

URN: urn:nbn:se:kth:diva-125832
OAI: oai:DiVA.org:kth-125832
DiVA: diva2:640979

##### Note

QC 20130816
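The extreme example in the abstract can be checked with a short calculation. Under a symmetric Dirichlet(α) prior on the K mixture weights, integrating the weights out gives the prior probability of one specific labeling of N samples with per-component counts (n_1, …, n_K) as Γ(Kα)/Γ(N+Kα) · ∏_k Γ(n_k+α)/Γ(α). The sketch below (not the report's code; the numbers are only illustrative) compares the labeling that merges both clusters into one component against the labeling that keeps the 1-sample and 3-sample clusters separate:

```python
from math import lgamma, exp

def log_labeling_prior(counts, alpha):
    """Log prior probability of one specific labeling of N samples whose
    per-component counts are `counts`, under a symmetric Dirichlet(alpha)
    prior on the K mixture weights (weights integrated out analytically)."""
    K = len(counts)
    N = sum(counts)
    return (lgamma(K * alpha) - lgamma(N + K * alpha)
            + sum(lgamma(n + alpha) - lgamma(alpha) for n in counts))

# Two clusters with 1 and 3 samples, K = 2 mixture components.
merged = [4, 0]   # all samples in one component ("over-pruned")
split = [1, 3]    # each cluster assigned its own component

for alpha in (0.01, 1.0, 10.0):
    ratio = exp(log_labeling_prior(merged, alpha)
                - log_labeling_prior(split, alpha))
    print(f"alpha = {alpha:5.2f}: prior mass merged/split = {ratio:.2f}")
```

For this case the ratio works out analytically to (3+α)/α, so as α → 0 the prior mass of the merged labeling dominates without bound. The prior alone thus pushes the exact posterior toward a single active component, and the likelihood of the well-separated clusters must overcome it, which is consistent with the report's conclusion that over-pruning traces back to the choice of prior parameters rather than to the VI approximation.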

Available from: 2013-08-15. Created: 2013-08-15. Last updated: 2013-08-16. Bibliographically approved.