st: xtlogit postestimation with predict


From   Dave Ohls <[email protected]>
To   [email protected]
Subject   st: xtlogit postestimation with predict
Date   Tue, 9 Jul 2013 15:27:28 -0400

I am getting inconsistent sets of postestimation predicted
probabilities from -predict- after -xtlogit- models.  I'm using
Stata/IC 11.2 for Windows.

I am estimating fixed-effects logit models using code of the form:
-xtlogit DV IV1 IV2 CV1 CV2 if CV3==1, fe-
and want to interpret the substantive results for the continuous IV1
in terms of predicted probabilities at different values.  Because the
effects are non-linear and depend on the values of the fixed effects
and the other variables, I am evaluating them within specific,
substantively important cases.
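
For reference, the data are declared as a panel before estimation;
roughly (with panelid standing in for my actual group variable):

  xtset panelid
  xtlogit DV IV1 IV2 CV1 CV2 if CV3==1, fe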

To do so, I create 5-10 copies (flagged with a 1 in a variable called
dummy) of a particular case and set the dependent variable to missing
in the copies so that they are not included in the estimation of the
model itself.  I keep all variable values as they are in the real
case, except that I alter IV1, setting it to its minimum in one copy,
its mean in another, its maximum in another, its mean plus one SD in
another, and so on.  I then estimate the model and run the
postestimation commands.
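
Concretely, the construction looks roughly like this (the selector
caseid==42, the number of copies, and the variable copy are
illustrative; only dummy is named as above):

  summarize IV1 if CV3==1
  local min  = r(min)
  local mean = r(mean)
  local max  = r(max)
  local sd   = r(sd)
  gen byte dummy = 0                        // will flag the copies
  expand 5 if caseid==42                    // add 4 copies of the chosen case
  bysort caseid: replace dummy = 1 if caseid==42 & _n > 1
  replace DV = . if dummy==1                // copies drop out of the estimation
  bysort caseid (dummy): gen byte copy = sum(dummy)  // number the copies 1..4
  replace IV1 = `min'        if copy==1     // IV1 at its minimum
  replace IV1 = `mean'       if copy==2     // ... at its mean
  replace IV1 = `mean'+`sd'  if copy==3     // ... at mean + 1 SD
  replace IV1 = `max'        if copy==4     // ... at its maximum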

The problem is that I get very different sets of results when I run:
-predict p1 if dummy==1-
than when I run:
-predict p2-
The numbers are not the same even for those observations (dummy==1)
for which both commands return a predicted probability.  I assume this
has something to do with how -predict- handles the fixed effects, but
I cannot tell from the manual, past forum topics, etc. what it is, or
which set is correct.

Also, I get a totally different (third) set of results when I run:
-predict p3, pu0-
Given the information in the manual, I interpret this set as the
predicted probabilities with the fixed effects set to 0, which is not
substantively correct for what I am trying to do; I include it here
only to show that this is not what is happening in either of the two
sets of results above.
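
For completeness, the full sequence that produces the three
conflicting sets is essentially this (copy is the copy index from the
construction sketch above; the -list- at the end is just how I compare
the results):

  xtlogit DV IV1 IV2 CV1 CV2 if CV3==1, fe
  predict p1 if dummy==1       // first set: restricted to the copies
  predict p2                   // second set: no -if- restriction
  predict p3, pu0              // third set: fixed effect assumed to be 0
  list copy IV1 p1 p2 p3 if dummy==1, noobs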

I have tried to replicate this on other datasets and cannot reproduce
the inconsistency.  Any ideas?

Thanks so much for your time.

-Dave
*
*   For searches and help try:
*   http://www.stata.com/help.cgi?search
*   http://www.stata.com/support/faqs/resources/statalist-faq/
*   http://www.ats.ucla.edu/stat/stata/

