



RE: st: nbreg - problem with constant?


From   Simon Falck <[email protected]>
To   "[email protected]" <[email protected]>
Subject   RE: st: nbreg - problem with constant?
Date   Mon, 5 Mar 2012 13:54:45 +0000

...thanks also to Dave Jacobs! :D

/Simon

-----Original Message-----
From: Simon Falck 
Sent: 5 March 2012 14:54
To: '[email protected]'
Subject: RE: st: nbreg - problem with constant?

Dear David Hoaglin, Joerg, and Richard Williams,

Thanks for your insights on the nbreg model!!

I will consider these insights in my forthcoming work.

Thank you again,
/Simon



-----Original Message-----
From: [email protected] [mailto:[email protected]] On Behalf Of Jacobs, David
Sent: 4 March 2012 18:12
To: '[email protected]'
Subject: RE: st: nbreg - problem with constant?

Another simple step is to look at the actual size of your explanatory variables.  

I've often found that zero-inflated count models have problems converging. These problems are much diminished if I divide extremely large explanatory variables, such as state or city populations, by a constant like 10,000 or 100,000. I suppose this expedient gets all the variables onto roughly the same numeric scale.
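A minimal sketch of why the rescaling is harmless, illustrated here in Python/numpy rather than Stata (the data and variable names are hypothetical, not from the thread): dividing a predictor by 10,000 simply multiplies its estimated coefficient by 10,000 and leaves the fitted values unchanged, shown below for least squares; the same scale relationship holds for the linear predictor in a count model.

```python
import numpy as np

# Hypothetical data: an outcome on a modest scale and a very large
# predictor (e.g., a city population). Purely illustrative.
rng = np.random.default_rng(0)
pop = rng.uniform(50_000, 5_000_000, size=200)     # raw populations
y = 0.5 + 1e-6 * pop + rng.normal(0, 0.1, 200)     # outcome

X_raw = np.column_stack([np.ones(200), pop])
X_scaled = np.column_stack([np.ones(200), pop / 10_000])

b_raw, *_ = np.linalg.lstsq(X_raw, y, rcond=None)
b_scaled, *_ = np.linalg.lstsq(X_scaled, y, rcond=None)

# The rescaled coefficient is 10,000 times the raw one, and the
# fitted values from the two parameterizations are identical.
print(b_scaled[1] / b_raw[1])                           # ~10000
print(np.allclose(X_raw @ b_raw, X_scaled @ b_scaled))  # True
```

The estimates are equivalent, but the rescaled version keeps the numeric magnitudes of the columns comparable, which is friendlier to the maximum-likelihood optimizer.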

Dave Jacobs

-----Original Message-----
From: [email protected] [mailto:[email protected]] On Behalf Of David Hoaglin
Sent: Friday, March 02, 2012 4:56 PM
To: [email protected]
Subject: Re: st: nbreg - problem with constant?

Hi, Simon.

When you remove the constant from the model, some of the variation in the dependent variable that was accounted for by the constant is then accounted for (to the extent possible) by the predictors that remain in the model.  The result is not necessarily to make the coefficients of those predictors larger, but they will generally change.

Consider how removing the constant would work in ordinary multiple regression.  If a predictor variable does not have mean 0 (in the data), removing the constant will change its coefficient.  You can even see this happen in simple regression when you fit a line that has a slope and an intercept and then fit a line through the origin.  It's easy to construct an example in which the two slopes have different signs.  One has to keep in mind that the definition of a coefficient in a regression (or similar) model depends on the list of other predictors in the model.
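One such constructed example (my own numbers, not from the thread), sketched in Python/numpy: with a predictor far from mean 0 and an outcome that decreases in it, the fitted line with an intercept has a negative slope, while the line forced through the origin has a positive one.

```python
import numpy as np

# Tiny constructed example: x is far from mean 0, y decreases in x.
x = np.array([10.0, 11.0, 12.0])
y = np.array([5.0, 4.0, 3.0])

# Ordinary least squares with slope and intercept.
slope_int, intercept = np.polyfit(x, y, 1)

# Regression through the origin: slope = sum(x*y) / sum(x^2).
slope_origin = (x @ y) / (x @ x)

print(slope_int)     # ~ -1.0
print(slope_origin)  # ~ 0.356 -- same data, opposite sign
```

Dropping the intercept forces the origin-line slope to absorb the overall level of y, which here reverses the sign entirely.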

In your negative binomial model, I don't think you want to take exp of the coefficients and then interpret those exponentiated values as if they were the coefficients themselves.

Regards,

David Hoaglin
*
*   For searches and help try:
*   http://www.stata.com/help.cgi?search
*   http://www.stata.com/support/statalist/faq
*   http://www.ats.ucla.edu/stat/stata/

