



Re: st: data set too large


From   Austin Nichols <austinnichols@gmail.com>
To   statalist@hsphsun2.harvard.edu
Subject   Re: st: data set too large
Date   Thu, 7 Apr 2011 13:23:29 -0400

Susan L Averett <averetts@lafayette.edu>:
set mem 4000m
If you can't set memory high enough (but consider upgrading your OS or
RAM as necessary), you can edit the .dct file that the ECLS extraction
program spits out to remove any extraneous variables; e.g., did you
really want several hundred weight variables (all stored as doubles)?
You can also downgrade the storage type of many variables to byte, but
that requires knowing which variables are suitable. Make sure you
-compress- before you -save- in any case.
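A minimal sketch of that workflow (filenames are hypothetical; this assumes the ECLS extraction produced ecls.dct, which names its .dat file internally):

```stata
* raise the memory ceiling before reading anything (Stata 11 SE)
set mem 4000m

* read the raw data via the (possibly edited) dictionary file
infile using ecls.dct, clear

* downcast every variable to the smallest storage type that
* holds its values without loss
compress

* if you know a variable only takes small integer values, you can
* recast it explicitly, e.g.:
* recast byte flagvar

save ecls, replace
```

Deleting unneeded variable definitions from the .dct before running -infile- is what actually keeps the data from blowing past the memory limit; -compress- only helps after the load succeeds.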

On Wed, Apr 6, 2011 at 10:55 PM, Averett, Susan L
<averetts@lafayette.edu> wrote:
> Hi all: I am reading in a large data set, the ECLS, and it comes with a Stata dictionary file, so obviously it was meant to be read into Stata. I have Stata 11 SE but I cannot get it to load. It is too big.
> The error message (below) tells me to compress the data, but how can I compress the data when it is not even read into Stata yet?
> Likewise, how can I drop observations? I am reading in a dictionary file. I've got set mem set as large as it will go for me:
> 1000m, and the same with maxvar; it is set to 30,000.
> Help!
*
*   For searches and help try:
*   http://www.stata.com/help.cgi?search
*   http://www.stata.com/support/statalist/faq
*   http://www.ats.ucla.edu/stat/stata/

