
Re: st: nbreg - problem with constant?


From: David Hoaglin <[email protected]>
To: [email protected]
Subject: Re: st: nbreg - problem with constant?
Date: Fri, 2 Mar 2012 16:54:18 -0500

Hi, Simon.

When you remove the constant from the model, some of the variation in
the dependent variable that was accounted for by the constant is then
accounted for (to the extent possible) by the predictors that remain
in the model.  The result is not necessarily that the coefficients of
those predictors become larger, but they will generally change.

Consider how removing the constant would work in ordinary multiple
regression.  If a predictor variable does not have mean 0 (in the
data), removing the constant will change its coefficient.  You can
even see this happen in simple regression when you fit a line that has
a slope and an intercept and then fit a line through the origin.  It's
easy to construct an example in which the two slopes have different
signs.  One has to keep in mind that the definition of a coefficient
in a regression (or similar) model depends on the list of other
predictors in the model.
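
As a minimal sketch (with made-up numbers, so the variable names and
values are only for illustration), you can see the sign flip in Stata:

* Toy data in which y falls as x rises, but x is far from 0
clear
input y x
5 10
4 11
4 12
2 13
end

* Slope and intercept: the slope is negative (about -0.9)
regress y x

* Line forced through the origin: the slope becomes positive (about +0.3)
regress y x, noconstant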

In your negative binomial model, I don't think you want to take exp of
the coefficients and then interpret the exponentiated values as if
they were the coefficients themselves.
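
As an aside (a sketch, with hypothetical variable names): if a
multiplicative interpretation is what you are after, -nbreg- will
exponentiate the coefficients for you and label them as incidence-rate
ratios when you use the irr option, which keeps their interpretation
straight:

* Coefficients on the log-count scale
nbreg y x1 x2

* The same model reported as incidence-rate ratios, i.e. exp(b)
nbreg y x1 x2, irr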

Regards,

David Hoaglin

