



re: Re: st: xtreg: double format and memory issues


From   Christopher Baum <[email protected]>
To   "[email protected]" <[email protected]>
Subject   re: Re: st: xtreg: double format and memory issues
Date   Thu, 24 Mar 2011 17:12:46 -0400

<>
Xtreg, at least the way I use it and reported here, takes out the
means for the individual-by-year effects but introduces 720 dummy
variables for the day fixed effects. That suggests we need something
like 550,000 observations * 720 day dummies * 8 bytes = 3,168
megabytes, plus 550,000 observations * 13 variables * 8 bytes = 57.2
megabytes. More or less 3,200 megabytes, but I can allocate only
2,900 MB.

Do the time demeaning yourself: it is just a matter of transforming the data as y_i,t - y_.,t, where y_.,t is the mean of y across panels at that point in time. You can compute those means with -egen-, or use Ben Jann's -center- command with the by time: prefix. Then just run xtreg, fe on the transformed data (you must transform all variables that enter the model, of course). Conceptually you should adjust the s.e.'s for the loss of 720 more d.f., but with over half a million observations that would be a waste of time.
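A minimal sketch of that transformation in Stata; the variable names y, x1, x2 and the time variable day are placeholders for whatever is in your model, and -center- is Ben Jann's user-written command from SSC (install with -ssc install center-):

```stata
* Demean each model variable across panels within each day.
* Assumes the data are xtset as panel id by day; y x1 x2 are placeholders.

* Option 1: by hand with -egen-
foreach v of varlist y x1 x2 {
    egen double `v'_dbar = mean(`v'), by(day)
    gen double `v'_dm = `v' - `v'_dbar
    drop `v'_dbar
}

* Option 2: Ben Jann's -center- (default prefix is c_)
* bysort day: center y x1 x2

* Individual fixed effects are then swept out by xtreg, fe
* on the time-demeaned data
xtreg y_dm x1_dm x2_dm, fe
```

Using -gen double- keeps the demeaned variables in double precision, which matters when subtracting means from large values.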

Kit

Kit Baum   |   Boston College Economics & DIW Berlin   |   http://ideas.repec.org/e/pba1.html
An Introduction to Stata Programming   |   http://www.stata-press.com/books/isp.html
An Introduction to Modern Econometrics Using Stata   |   http://www.stata-press.com/books/imeus.html




*
*   For searches and help try:
*   http://www.stata.com/help.cgi?search
*   http://www.stata.com/support/statalist/faq
*   http://www.ats.ucla.edu/stat/stata/

