


Re: st: data set too large

From   Austin Nichols <>
Subject   Re: st: data set too large
Date   Thu, 7 Apr 2011 13:23:29 -0400

Susan L Averett <>:
set mem 4000m
If you can't set memory high enough (but consider upgrading your OS or
RAM as necessary), you can edit the .dct file that the ECLS extraction
program spits out to remove any extraneous variables. For example, did
you really want several hundred weight vars (all doubles)? You can also
downgrade the storage type of many variables to byte, but that requires
knowing which variables are suitable. In any case, make sure you
-compress- before you -save-.
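
A minimal sketch of the workflow above; the filename and variable name
here are hypothetical, and the memory amount is only illustrative:

```stata
* Raise the memory limit before reading the data (Stata 11 SE)
clear
set mem 4000m

* Read the fixed-format raw data via the dictionary file that came
* with the extract (filename is hypothetical). To drop extraneous
* variables, delete their lines from the .dct in a text editor first,
* so they are never read in at all.
infile using ecls.dct

* Downgrade storage where the values permit, e.g. a 0/1 indicator
* (variable name is hypothetical)
recast byte childsex

* -compress- picks the smallest safe storage type for every variable
compress
save ecls_small, replace
```

Trimming the .dct before -infile- is the key step when the full data
set will not fit in memory at all, since -compress- can only run after
the data have been loaded.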

On Wed, Apr 6, 2011 at 10:55 PM, Averett, Susan L
<> wrote:
> Hi all: I am reading in a large data set, the ECLS, which comes with a Stata dictionary file, so obviously it was meant to be read into Stata. I have Stata 11 SE, but I cannot get it to load; it is too big.
> The error message (below) tells me to compress the data, but how can I compress the data when it has not even been read into Stata yet?
> Likewise, how can I drop observations? I am reading in a dictionary file. I've got -set mem- as large as it will go for me, 1000m, and likewise max is set to 30,000.
> Help!
