A gradient algorithm locally equivalent to the EM algorithm. (English) Zbl 0813.62021
Summary: In many problems of maximum likelihood estimation, it is impossible to carry out either the E-step or the M-step of the EM algorithm. The present paper introduces a gradient algorithm that is closely related to the EM algorithm. This EM gradient algorithm approximately solves the M-step of the EM algorithm by one iteration of Newton’s method. Since Newton’s method converges quickly, the local properties of the EM gradient algorithm are almost identical to those of the EM algorithm. Any strict local maximum point of the observed likelihood locally attracts the EM and EM gradient algorithms at the same rate of convergence, and near the maximum point the EM gradient algorithm always produces an increase in the likelihood. With proper modification the EM gradient algorithm also exhibits global convergence properties that are similar to those of the EM algorithm. Our proof of global convergence applies to and improves existing theory for the EM algorithm. These theoretical points are reinforced by a discussion of three realistic examples illustrating how the EM gradient algorithm can succeed where the EM algorithm is intractable.
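To make the update concrete, the display below is a minimal sketch of the EM gradient step described in the summary, assuming the standard notation in which \(Q(\theta\mid\theta^{(m)})\) denotes the E-step surrogate (the conditional expectation of the complete-data log-likelihood) and \(L(\theta)\) the observed log-likelihood; it relies on the standard EM identity that the gradients of \(Q(\cdot\mid\theta^{(m)})\) and \(L\) agree at \(\theta^{(m)}\):

\[
\theta^{(m+1)} \;=\; \theta^{(m)} \;-\; \Big[d^{2}Q\big(\theta^{(m)}\mid\theta^{(m)}\big)\Big]^{-1}\,\nabla L\big(\theta^{(m)}\big),
\]

that is, a single Newton iteration applied to the M-step objective \(\theta \mapsto Q(\theta\mid\theta^{(m)})\), started from \(\theta^{(m)}\), in place of a full maximization.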

MSC:
62F10 Point estimation
65C99 Probabilistic methods, stochastic differential equations
62F35 Robustness and adaptive procedures (parametric inference)