
st: Drop in R-squared when adding variables in xtreg


From   "Gaetano Dadamo" <gaetano.dadamo@uv.es>
To   <statalist@hsphsun2.harvard.edu>
Subject   st: Drop in R-squared when adding variables in xtreg
Date   Mon, 22 Apr 2013 09:46:42 +0200

Dear statalisters,

I’ve been running a fixed-effects (FE) regression in Stata and am getting puzzling results. For example, I run the model

xtreg y y1 x year_dummies, fe cluster(country)

where y1 is the first lag of y, and get an overall R-squared of 0.71. Then I want to see the effect of an institutional variable z on the coefficients of y1 and x, so I run the regression

xtreg y y1 x c.z#c.y1 c.z#c.x year_dummies, fe cluster(country)

but my overall R-squared falls to 0.21. I have the same number of observations in both samples; it is the between R-squared that falls the most.

Why is that? Shouldn’t a regression’s explanatory power never fall when variables are added?
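For what it’s worth, my understanding from the manual is that the overall R-squared reported by -xtreg, fe- is just the squared correlation between y and the linear prediction from the within-estimated slopes, which would explain why it can fall. A sketch of how I checked this, using my variable names (i.year stands in for my year dummies):

```stata
* the overall R-sq from xtreg, fe should equal the squared
* correlation between y and the linear prediction Xb
xtreg y y1 x i.year, fe cluster(country)
predict double xbhat, xb
correlate y xbhat
display r(rho)^2        // should match e(r2_o) reported by xtreg
```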

I get the same result with two different institutional variables. One is union density, which varies considerably both across and within units, so multicollinearity can hardly be the problem. The other is a dummy for the new EU member states, which is constant within countries (but, since it enters only through interactions, it is not dropped from the model). Here the R-squared falls from 0.71 to 0.06.

Is there a problem with the estimation? More generally, is the overall R-squared the right measure of goodness of fit here?
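In case it helps, I have been comparing the three R-squareds that -xtreg- stores after estimation (again a sketch with my variable names):

```stata
* after each model, xtreg stores all three R-squareds
xtreg y y1 x i.year, fe cluster(country)
display e(r2_w)    // within  R-sq: the quantity the FE estimator maximizes
display e(r2_b)    // between R-sq: fit of the group means
display e(r2_o)    // overall R-sq
```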

Thank you so much.

Gaetano


*
*   For searches and help try:
*   http://www.stata.com/help.cgi?search
*   http://www.stata.com/support/faqs/resources/statalist-faq/
*   http://www.ats.ucla.edu/stat/stata/

