
Notice: On April 23, 2014, Statalist moved from an email list to a forum, based at statalist.org.



st: xtreg with large datasets


From   Patrick Roland <[email protected]>
To   [email protected]
Subject   st: xtreg with large datasets
Date   Mon, 8 Nov 2010 22:26:51 -0800

Hi all,

I have a large panel dataset. I'd like to include individual and time
fixed effects with clustered standard errors. One way to do this would
be:

xtreg y x i.time, fe cluster(individual)

The problem is that the dataset is large, and this command requires a
lot of memory (>10 GB). I was wondering whether there is any way to
demean the data so that I can obtain estimates of the coefficients on
x without having to include the time dummies as regressors.

With a balanced panel, one can subtract the individual and time means
from y and x (adding back the grand mean) and simply regress y on x.
This gives the same point estimates as the xtreg command above, though
the reported standard errors differ. Does anyone know how to proceed
in the unbalanced-panel case? Also, how might one obtain correct
standard errors from this regression (i.e., those reported by xtreg)?
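For what it's worth, the balanced-panel equivalence described above is
easy to check numerically. Here is a minimal sketch in Python/NumPy
(rather than Stata, purely for illustration; the panel dimensions,
seed, and data-generating process are made up). It compares the
coefficient on x from a dummy-variable (LSDV) regression with the one
from the double-demeaned regression:

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 50, 10  # hypothetical balanced panel: N individuals, T periods
alpha = rng.normal(size=N)   # individual effects
gamma = rng.normal(size=T)   # time effects
x = rng.normal(size=(N, T)) + alpha[:, None] + gamma[None, :]
y = 2.0 * x + alpha[:, None] + gamma[None, :] + rng.normal(size=(N, T))

# (1) LSDV: regress y on x plus N individual dummies and T-1 time dummies
n_obs = N * T
X = np.zeros((n_obs, 1 + N + T - 1))
X[:, 0] = x.ravel()
for i in range(N):
    X[i * T:(i + 1) * T, 1 + i] = 1.0       # individual dummies
for t in range(T - 1):
    X[t::T, 1 + N + t] = 1.0                # time dummies (one dropped)
beta_lsdv = np.linalg.lstsq(X, y.ravel(), rcond=None)[0][0]

# (2) double demeaning: subtract individual and time means, add grand mean
def demean2(a):
    return (a - a.mean(axis=1, keepdims=True)
              - a.mean(axis=0, keepdims=True) + a.mean())

yd, xd = demean2(y).ravel(), demean2(x).ravel()
beta_dm = (xd @ yd) / (xd @ xd)             # simple regression of yd on xd

print(beta_lsdv, beta_dm)  # point estimates agree (Frisch-Waugh-Lovell)
```

The agreement follows from the Frisch-Waugh-Lovell theorem: with a
balanced panel, residualizing on both sets of dummies is exactly the
double-demeaning transformation. That is precisely what breaks down
when the panel is unbalanced.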

It seems to me that this can't be an uncommon problem: anyone
including time effects in a fixed-effects regression on a large panel
dataset must run into it. Any help would be greatly appreciated.

Thanks,

Patrick.
*
*   For searches and help try:
*   http://www.stata.com/help.cgi?search
*   http://www.stata.com/support/statalist/faq
*   http://www.ats.ucla.edu/stat/stata/

