Handling empty components in EM implementation for GMM learning


I am trying to implement learning a Gaussian Mixture Model using EM from scratch in MATLAB. The project requires some later modifications to the standard GMM model, which is why I am not using off-the-shelf implementations such as VLFeat or the Stats Toolbox. Rolling my own implementation would be a learning experience, and it will be easier to customize later on.

Specifically, I am coding EM for a GMM with spherical covariances.

  1. Handling empty clusters. I am having trouble with the case where some components of the GMM are not assigned any data, i.e. they end up with zero or negligible posterior probability mass. This arises when a large number of components is specified.

  2. My intuition would be to select the component with the largest covariance and reassign half of its data to the empty component.

My question: is there a standard, principled way of handling empty components in EM implementations (which I haven't managed to find via Google)?
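For reference, one common heuristic (also used in k-means-style implementations) is to reseed a degenerate component rather than split the widest one: move its mean onto the data point that the current model explains worst. The question's code is in MATLAB, but the logic translates directly; below is a minimal NumPy sketch under that assumption, where `resp` is the N-by-K responsibility matrix and all names are illustrative:

```python
import numpy as np

def fix_empty_components(X, resp, means, eps=1e-8):
    """Reseed components whose total responsibility mass is ~zero.

    X: (N, D) data; resp: (N, K) responsibilities; means: (K, D) means.
    This is an illustrative heuristic, not the only possible fix.
    """
    Nk = resp.sum(axis=0)                  # effective number of points per component
    for k in np.flatnonzero(Nk < eps):
        # Move the empty component's mean onto the data point that is
        # least explained by the model (lowest maximum responsibility).
        worst = np.argmin(resp.max(axis=1))
        means[k] = X[worst]
        resp[worst] = 0.0
        resp[worst, k] = 1.0               # hard-assign that point to component k
    return resp, means
```

After reseeding, the next M-step re-estimates the component's covariance and weight from its (small) new responsibility mass, and EM proceeds as usual.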


1 Answer

Answer by Has QUIT--Anony-Mousse:

Empty components should not arise in a GMM.

With soft assignments, every component retains at least a tiny fraction of the responsibility for every data point, so a component's mass never becomes exactly zero; it can only shrink gradually. (This is also why EM needs a convergence threshold: the soft assignments keep changing by ever smaller amounts rather than stopping outright.)
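To illustrate the answer's point: the E-step gives every component a strictly positive, if tiny, share of every point. A NumPy sketch of the E-step for the spherical-covariance case from the question, computed in log-space for numerical stability (all names are illustrative, not from the original MATLAB code):

```python
import numpy as np

def e_step_spherical(X, means, sigmas, weights):
    """Responsibilities for a spherical-covariance GMM.

    X: (N, D) data; means: (K, D); sigmas: (K,) per-component standard
    deviations; weights: (K,) mixing proportions summing to 1.
    """
    N, D = X.shape
    # Squared distance of every point to every mean: shape (N, K)
    d2 = ((X[:, None, :] - means[None, :, :]) ** 2).sum(axis=2)
    # Log of (mixing weight * spherical Gaussian density)
    log_p = (np.log(weights)
             - D * np.log(sigmas)
             - 0.5 * D * np.log(2 * np.pi)
             - d2 / (2 * sigmas ** 2))
    # Normalize each row via log-sum-exp; exp() of a finite log is > 0,
    # so every responsibility stays strictly positive.
    log_p -= log_p.max(axis=1, keepdims=True)
    resp = np.exp(log_p)
    resp /= resp.sum(axis=1, keepdims=True)
    return resp
```

In floating point, responsibilities far in the tail can still underflow to exactly zero, which is one way "empty" components appear in practice despite the soft assignments; a reseeding heuristic as discussed in the question is then a pragmatic fallback.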