
st: R squared of OLS with dummy variables


From   Stefano Lugo <stefano.lugo@mail.polimi.it>
To   statalist@hsphsun2.harvard.edu
Subject   st: R squared of OLS with dummy variables
Date   Fri, 06 Jul 2012 10:57:34 +0200

I am estimating an OLS model that includes dummy variables along with other regressors. If, instead of dropping one dummy to avoid collinearity, I drop the constant, I get the same coefficient estimates and standard errors (including for the dummies/constant), as expected, but a very different R squared for the model.

Computing the R squared myself with
reg y  x_a x_b d_1 d_2 d_3, nocons
predict fit if e(sample)
corr y fit
di r(rho)^2

I instead get the same R squared that Stata shows when dropping the dummy rather than the constant. To check whether the problem is somehow in my data, I repeated the exercise with simulated data and got the same result. I have also tried another software package, and it reports the same R squared for both specifications.
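For reference, the pattern described above is what one would expect if the R squared reported without a constant is computed against zero rather than against the sample mean (an "uncentered" R squared). This is a minimal numpy sketch of that arithmetic, not Stata code; the variable names and simulated data are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=(n, 2))             # two continuous regressors (like x_a, x_b)
g = rng.integers(0, 3, size=n)          # three groups -> three dummies
D = np.eye(3)[g]                        # full dummy set d_1, d_2, d_3
y = 5 + x @ np.array([1.0, -2.0]) + D @ np.array([0.0, 1.5, -1.0]) + rng.normal(size=n)

# Spec A: constant plus two dummies (drop d_1)
XA = np.column_stack([np.ones(n), x, D[:, 1:]])
# Spec B: no constant, all three dummies
XB = np.column_stack([x, D])

fitA = XA @ np.linalg.lstsq(XA, y, rcond=None)[0]
fitB = XB @ np.linalg.lstsq(XB, y, rcond=None)[0]

# The two parameterizations span the same column space,
# so the fitted values (and residuals) are identical.
same_fit = np.allclose(fitA, fitB)

rss = np.sum((y - fitA) ** 2)
# Centered R squared: variation measured around the sample mean.
r2_centered = 1 - rss / np.sum((y - y.mean()) ** 2)
# Uncentered R squared: variation measured around zero.
r2_uncentered = 1 - rss / np.sum(y ** 2)

print(same_fit, r2_centered, r2_uncentered)
```

With identical fitted values, the two R squared figures differ only because the total sum of squares is taken around the mean in one case and around zero in the other; whenever the mean of y is far from zero, the uncentered figure is larger.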

Is this a Stata bug, or am I missing some theoretical explanation?

Thank you for your help

--
Stefano Lugo, PhD
Politecnico di Milano
P.za Leonardo da Vinci 32
20133 Milan, Italy

*
*   For searches and help try:
*   http://www.stata.com/help.cgi?search
*   http://www.stata.com/support/statalist/faq
*   http://www.ats.ucla.edu/stat/stata/
