
Re: st: finite mixture models with the EM algorithm

From   Stas Kolenikov <[email protected]>
To   [email protected]
Subject   Re: st: finite mixture models with the EM algorithm
Date   Thu, 17 Sep 2009 15:59:45 -0500

On Thu, Sep 17, 2009 at 3:22 PM, Partha Deb <[email protected]> wrote:
> Also, while EM might be able
> to deal with multimodalities better than ML (esp. Stata's -ml-), there is
> nothing inherent about EM that makes it "better".

If anything, EM is actually worse than the brute force -ml- when it
comes to multimodality and false convergence declaration. (I am
surprised that I have to disagree with Imbens :)) There are known
examples in which EM converges to a saddlepoint of the likelihood
(the -difficult- option would probably be able to pull the
maximization out of it, though). McLachlan & Krishnan give an example
of this behavior (at least I remember one in the first edition; it
would be strange if they dropped it in the second edition). Also,
since neither the likelihood nor its gradient is computed, the
algorithm has to rely only on -tolerance- (in terms of Stata's
-maximize-), which is the worst convergence criterion to work with.
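To see the point concretely, here is a minimal sketch (in Python, not Stata; all names are illustrative) of EM for a two-component Gaussian mixture. Note that the stopping rule looks only at the change in the parameters between iterations, the analogue of Stata's -tolerance-; no likelihood gradient is ever computed or checked.

```python
import math
import random

def em_two_gaussians(x, tol=1e-6, max_iter=500):
    """EM for a two-component normal mixture; stops on parameter change only."""
    xs = sorted(x)
    n = len(x)
    # crude initialization: quartiles of the pooled sample
    mu1, mu2 = xs[n // 4], xs[3 * n // 4]
    s1 = s2 = (max(x) - min(x)) / 4 or 1.0
    pi1 = 0.5

    def pdf(v, m, s):
        return math.exp(-0.5 * ((v - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))

    for _ in range(max_iter):
        # E-step: posterior probability that each point came from component 1
        r = []
        for v in x:
            a = pi1 * pdf(v, mu1, s1)
            b = (1 - pi1) * pdf(v, mu2, s2)
            r.append(a / (a + b))
        # M-step: posterior-weighted means, SDs, and mixing proportion
        w1 = sum(r)
        new_mu1 = sum(ri * v for ri, v in zip(r, x)) / w1
        new_mu2 = sum((1 - ri) * v for ri, v in zip(r, x)) / (n - w1)
        new_s1 = max(math.sqrt(sum(ri * (v - new_mu1) ** 2
                                   for ri, v in zip(r, x)) / w1), 1e-6)
        new_s2 = max(math.sqrt(sum((1 - ri) * (v - new_mu2) ** 2
                                   for ri, v in zip(r, x)) / (n - w1)), 1e-6)
        new_pi1 = w1 / n
        # convergence is declared on parameter change alone -- this is the
        # -tolerance--style criterion the post warns about: it cannot tell a
        # maximum from a saddlepoint, since no gradient is ever examined
        delta = max(abs(new_mu1 - mu1), abs(new_mu2 - mu2),
                    abs(new_s1 - s1), abs(new_s2 - s2), abs(new_pi1 - pi1))
        mu1, mu2, s1, s2, pi1 = new_mu1, new_mu2, new_s1, new_s2, new_pi1
        if delta < tol:
            break
    return mu1, mu2, pi1
```

On a well-separated mixture this works fine; the trouble the post describes shows up near saddlepoints or flat ridges of the likelihood, where the parameter change can fall below -tolerance- long before a maximum is reached.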

Stas Kolenikov, also found at
Small print: I use this email account for mailing lists only.
