The Stata listserver

st: Large data sets


From   celdjt@umich.edu
To   "'statalist@hsphsun2.harvard.edu'" <statalist@hsphsun2.harvard.edu>
Subject   st: Large data sets
Date   Fri, 23 Aug 2002 14:44:25 -0400

I am working with a fairly large database using survival analysis and am at the point of doing analysis with time-varying covariates. When I attempt to stsplit the data, the program tells me the expanded dataset will contain over 2.3 million observations and needs 486 MB of memory. My approach has been to restart the analysis, set virtual memory on, set memory to 512M (the full amount of my RAM), stset the data, and run the stsplit command. The program starts to run but invariably stops responding. Before I spend money on additional RAM, I was wondering if anyone has suggestions or advice. Modelling requirements rule out Cox regression with tvc. Thanks.
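For reference, the sequence of commands described above would look roughly like the following (dataset name, id/failure/time variable names, and the split points are hypothetical placeholders, not from the original post):

```stata
* Minimal sketch of the workflow described above; names are illustrative.
clear
set memory 512m                 // request the full 512 MB of physical RAM
set virtual on                  // allow use of virtual memory (older Stata for Windows)
use survdata                    // hypothetical survival dataset
stset time, failure(fail) id(id)
stsplit interval, at(0(1)10)    // episode-split at hypothetical yearly intervals
```

Note that stsplit must hold both the original and the expanded data in memory while it runs, which is why the expansion to 2.3 million observations can exhaust memory even when set memory matches the reported requirement.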
* For searches and help try:
* http://www.stata.com/support/faqs/res/findit.html
* http://www.stata.com/support/statalist/faq
* http://www.ats.ucla.edu/stat/stata/
