
A view of the EM algorithm that justifies incremental, sparse, and other variants. (English) Zbl 0916.62019

Jordan, Michael I. (ed.), Learning in graphical models. Proceedings of the NATO ASI, Ettore Majorana Centre, Erice, Italy, September 27 - October 7, 1996. Dordrecht: Kluwer Academic Publishers. NATO ASI Series. Series D. Behavioural and Social Sciences. 89, 355-368 (1998).
Summary: The EM algorithm performs maximum likelihood estimation for data in which some variables are unobserved. We present a function that resembles negative free energy and show that the M step maximizes this function with respect to the model parameters and that the E step maximizes it with respect to the distribution over the unobserved variables. From this perspective, it is easy to justify an incremental variant of the EM algorithm in which the distribution for only one of the unobserved variables is recalculated in each E step. This variant is shown empirically to give faster convergence in a mixture estimation problem. A variant of the algorithm that exploits sparse conditional distributions is also described, and a wide range of other variant algorithms is seen to be possible.
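In standard notation (a sketch consistent with the summary's description; the paper's own notation may differ), with \(x\) the observed data, \(z\) the unobserved variables, \(q\) a distribution over \(z\), and \(\theta\) the model parameters, the function in question can be written
\[
F(q, \theta) = \mathbb{E}_{q(z)}\bigl[\log p(x, z \mid \theta)\bigr] + H(q) = \log p(x \mid \theta) - \mathrm{KL}\bigl(q(z)\,\|\,p(z \mid x, \theta)\bigr),
\]
so the E step maximizes \(F\) over \(q\) (attained at \(q(z) = p(z \mid x, \theta)\)) and the M step maximizes it over \(\theta\). Since any partial maximization of \(F\) cannot decrease this lower bound on the log likelihood, updating \(q\) for only one data point per E step remains valid, which is what licenses the incremental variant.

The following is a minimal sketch of such an incremental variant for a one-dimensional Gaussian mixture with known, equal component variances; all function and variable names are illustrative assumptions, not taken from the paper:

```python
# Minimal sketch of incremental EM for a 1-D Gaussian mixture
# (equal known variances, one data point refreshed per E step).
import numpy as np

def incremental_em(x, K=2, sigma=1.0, n_sweeps=20, seed=0):
    rng = np.random.default_rng(seed)
    n = len(x)
    mu = rng.choice(x, size=K, replace=False)   # component means
    pi = np.full(K, 1.0 / K)                    # mixing weights
    r = np.full((n, K), 1.0 / K)                # per-point responsibilities
    Nk = r.sum(axis=0)                          # aggregate sufficient stats
    Sk = r.T @ x
    for _ in range(n_sweeps):
        for i in range(n):
            # Incremental E step: refresh responsibilities of point i only,
            # keeping the aggregate sufficient statistics consistent.
            Nk -= r[i]; Sk -= r[i] * x[i]
            log_p = np.log(pi) - 0.5 * ((x[i] - mu) / sigma) ** 2
            r[i] = np.exp(log_p - log_p.max())
            r[i] /= r[i].sum()
            Nk += r[i]; Sk += r[i] * x[i]
            # M step: closed-form update from the current statistics.
            pi = Nk / n
            mu = Sk / np.maximum(Nk, 1e-12)
    return pi, mu

# Example usage on synthetic two-component data:
rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-2, 1, 200), rng.normal(3, 1, 300)])
print(incremental_em(data, K=2))
```

Because each inner iteration performs a cheap M step immediately after refreshing a single point's responsibilities, parameter estimates can improve after every data point rather than after every full pass, which is the source of the faster convergence reported in the summary.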
For the entire collection see [Zbl 0889.00024].

MSC:

62F10 Point estimation
65C99 Probabilistic methods, stochastic differential equations

Keywords:

EM algorithm