The Stata listserver

Re: st: another large data set question


From   "Oleksandr Talavera, EUV" <[email protected]>
To   [email protected]
Subject   Re: st: another large data set question
Date   Mon, 29 Mar 2004 12:09:15 +0200

To my knowledge, it is not possible for Stata to use a 17 GB data file. Even if your computer has 5 GB of RAM, the operating system (I once used a UNIX server) may give Stata only 2 GB.
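
As a rough check, one can ask the operating system for the memory explicitly (a sketch, assuming Stata 8 or earlier, where the allocation is controlled with -set memory-; the exact ceiling depends on the operating system and on whether Stata runs as a 32-bit or 64-bit process):

. set memory 2000m
. set memory 17000m

On a 32-bit build the second request will simply be refused, however much RAM or swap the machine has.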

best,
sasha

Ramani Gunatilaka wrote:

Hi all,
Stas Kolenikov's shapley.ado allows up to 56 variables. I am trying to do a decomposition with 29 variables, but the zero-one matrix for 29 variables would require 16,896 MB of memory (ref. http://www.stata.com/support/faqa/data/howbig.html).
Stata doesn't seem to have a problem generating the matrix (I tried it up to two gigabytes before realising that my computer wouldn't have enough disk space to store it).
But even if I were to use a more powerful computer, I am worried that Stata may not be able to use this file. Can someone please tell me which version of Stata, on which Windows operating system, can handle this large a data set? Also, roughly how much RAM would I require, and how much free disk space?
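For reference, the 16,896 MB figure is consistent with the howbig FAQ arithmetic if the zero-one matrix is held as 2^29 observations of 29 byte-typed variables plus roughly 4 bytes of per-observation overhead (the overhead is an assumption; only the arithmetic below is exact):

. display 2^29 * (29 + 4) / 1024^2
16896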
I referred to all the FAQs and didn't come up with anything very useful in relation to my particular question.
Thanks in advance,
Regards,
Ramani

*
*   For searches and help try:
*   http://www.stata.com/support/faqs/res/findit.html
*   http://www.stata.com/support/statalist/faq
*   http://www.ats.ucla.edu/stat/stata/


