



st: RE: Large data set that won't open


From   "Martin Weiss" <martin.weiss1@gmx.de>
To   <statalist@hsphsun2.harvard.edu>
Subject   st: RE: Large data set that won't open
Date   Thu, 24 Jun 2010 10:05:55 +0200

<>

The size of your RAM cannot be inferred from your current memory allocation -
the allocation is merely a lower bound on it. But I guess you do not have
6G of RAM, do you? Even then, you would still need additional RAM for
computations beyond what the data themselves occupy.

Note that you can -use- a dataset with -if- and -in- qualifiers (and with a
varlist), so it is still possible to get a subset into memory...
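For instance, to pull only part of the file into memory (the observation
range and variable names below are purely illustrative - substitute your
own):

. use in 1/100000 using "D:\My Documents\STATA\upstata.dta", clear
. use id year income using "D:\My Documents\STATA\upstata.dta", clear

Once a subset is loaded, -compress- may shrink it further so that a larger
slice fits next time.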


HTH
Martin

-----Original Message-----
From: owner-statalist@hsphsun2.harvard.edu
[mailto:owner-statalist@hsphsun2.harvard.edu] On Behalf Of John Antonakis
Sent: Thursday, 24 June 2010 09:37
To: statalist@hsphsun2.harvard.edu
Subject: st: Large data set that won't open

Hi:

I have a rather large data set that I cannot open in Stata; here are the 
particulars:

. ls "D:\My Documents\STATA\upstata.dta"
5375.8M   6/23/10 10:44  upstata.dta  

When I try to open the data set I get:

. use "D:\My Documents\STATA\upstata.dta", clear
no room to add more observations
    An attempt was made to increase the number of observations beyond what
    is currently possible.  You have the following alternatives:

     1.  Store your variables more efficiently; see help compress.  (Think
         of Stata's data area as the area of a rectangle; Stata can trade
         off width and length.)

     2.  Drop some variables or observations; see help drop.

     3.  Increase the amount of memory allocated to the data area using
         the set memory command; see help memory.
r(901);

Here is my current memory allocation:

. q memory

Current memory allocation

                    current                                 memory usage
    settable          value     description                 (1M = 1024k)
    --------------------------------------------------------------------
    set maxvar         5000     max. variables allowed           1.909M
    set memory         1024M    max. data space              1,024.000M
    set matsize        1000     max. RHS vars in models          7.713M
                                                            -----------
                                                             1,033.622M


It seems that I just don't have enough RAM to open this file.  Is that 
correct?

Best,
J.

-- 
____________________________________________________

Prof. John Antonakis, Associate Dean 
Faculty of Business and Economics
Department of Organizational Behavior
University of Lausanne
Internef #618
CH-1015 Lausanne-Dorigny
Switzerland

Tel ++41 (0)21 692-3438
Fax ++41 (0)21 692-3305

Faculty page:
http://www.hec.unil.ch/people/jantonakis

Personal page:
http://www.hec.unil.ch/jantonakis
____________________________________________________

*
*   For searches and help try:
*   http://www.stata.com/help.cgi?search
*   http://www.stata.com/support/statalist/faq
*   http://www.ats.ucla.edu/stat/stata/


