



Re: st: GLLAMM problem in comprob3


From   Stas Kolenikov <[email protected]>
To   [email protected]
Subject   Re: st: GLLAMM problem in comprob3
Date   Sat, 5 May 2012 11:18:07 -0500

I imagine that -gllamm- tries different approximations to the
likelihood (e.g., when switching from adaptive to regular quadrature),
but since your data are so sparse, none of these approximations is
accurate enough, so they both just break down. Your model is most
likely empirically underidentified: if an individual has all zeroes,
then there is no way to pull out any reasonable estimate for that
person's random effect. You simply need to have more ones to get any
meaningful results; no sensible method will help you with that.
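
As a quick check on how sparse things are at the person level, something
along these lines should do (just a sketch; -success-, -group-, and
-personID- are the variable names taken from your command, while -anyone-
and -persontag- are new variables created only for this check):

  * ones by group
  tabulate group success if group <= 2
  * flag persons who have at least one success
  egen byte anyone = max(success) if group <= 2, by(personID)
  * tag one observation per person, then count persons by group
  egen byte persontag = tag(personID) if group <= 2
  tabulate group anyone if persontag == 1

If nearly every person ends up with anyone == 0, the data carry essentially
no information about the person-level variance.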

For single-level data, there is Firth's logistic regression (-findit
firthlogit-), which is similar to ridge regression in that it shrinks
the parameters towards zero by imposing a likelihood penalty
equivalent to the Jeffreys prior. I am not sure whether there are any
extensions of that approach for multilevel data.
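
If you want to try it on the fixed part only (ignoring the personID
random effect, so it is not a drop-in replacement for your -gllamm-
model), the sketch would look something like this:

  * -firthlogit- is user-written; I believe it is available from SSC
  ssc install firthlogit
  * penalized (Firth) logit, fixed part only
  firthlogit success groupeffect if group <= 2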

On Sat, May 5, 2012 at 6:55 AM, P.T.Dijkstra <[email protected]> wrote:
> Dear all,
> While using the GLLAMM package, I sometimes obtain the following error message:
>> can't get correct log-likelihood: -31.09976 should be -31.123589
>> something went wrong in comprob3
>>
>
>
> I am under the impression that GLLAMM cannot estimate the model correctly because I have almost no "ones" in the dataset.
> There are two groups: group 1 has only 20 ones out of 970 observations, and group 2 has only 2 ones out of 1,000 observations.
>
>
> I try to estimate a group effect as follows:
> gllamm success groupeffect if group <= 2, i(personID group) family(binom) link(logit) robust adapt search(25) iterate(250)
>
>
> The output:
> Running adaptive quadrature
> Iteration 0:    log likelihood = -37.679599
> Iteration 1:    log likelihood = -33.411411
> (...)
> Iteration 249:    log likelihood = -31.095211
> Iteration 250:    log likelihood = -31.195684
>
> Adaptive quadrature has converged, running Newton-Raphson
> Iteration 0:   log likelihood = -31.195684  (not concave)
> Iteration 1:   log likelihood = -31.135656
> (...)
> Iteration 4:   log likelihood = -31.123589
> can't get correct log-likelihood: -31.09976 should be -31.123589
> something went wrong in comprob3
> r(198);
>
>
>
>
> Does anyone recognize this problem with the same data structure, or is there anything else I can do?
> I know that this kind of problem has been posted before, but any help is highly appreciated.
>
>
> Best,
> Peter Dijkstra
> University of Groningen
>
>
>



-- 
---- Stas Kolenikov
-- http://stas.kolenikov.name
---- Senior Survey Statistician, Abt SRBI
-- Opinions stated in this email are mine only, and do not reflect the
position of my employer


