Statalist



st: increasing time burden during resampling


From   "Feiveson, Alan H. (JSC-SK311)" <[email protected]>
To   "[email protected]" <[email protected]>
Subject   st: increasing time burden during resampling
Date   Mon, 28 Dec 2009 10:18:52 -0600

Hi - I am running Stata 11 on Windows XP and I am implementing a form of multiple testing using a resampling method as described in Westfall & Young's text. Basically, for each iteration the method is: 

1) resample the data, modified to a joint null situation (I use the Stata -bs- command for this)
2) fit a model to the resampled modified data
3) do a bunch of tests using -test-
4) save the test results (a rough sketch of such a loop follows).
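
For concreteness, the skeleton of one iteration looks roughly like the lines below. The regression model, the variable names y and x1-x3, the number of replications, and the use of -bsample- inside an explicit loop (rather than the -bs- prefix) are placeholders to show the structure, not my actual code.

    postfile sim p1 p2 p3 using nullpvals, replace
    set seed 12345
    forvalues i = 1/1000 {
        preserve
        bsample                        // step 1: resample the null-modified data
        quietly regress y x1 x2 x3     // step 2: fit a model to the resampled data
        quietly test x1                // step 3: one of several Wald tests
        local p1 = r(p)
        quietly test x2
        local p2 = r(p)
        quietly test x3
        local p3 = r(p)
        post sim (`p1') (`p2') (`p3')  // step 4: save the test results
        restore
    }
    postclose sim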

These operations are identical for each iteration, yet the time per iteration increases roughly linearly until it becomes prohibitive to continue. 

If I stop after a given number of iterations, I must close out all Stata processes before starting again; otherwise the bogged-down state persists.

Previous versions of this program, which used the estimated coefficients and standard errors without -test-, did not seem to have this problem, so I suspect the slowdown has something to do with repeated use of the -test- command. Maybe something needs to be reset or cleared after each iteration?
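
For example, I could tack something like the lines below onto the end of each iteration. These commands exist, but I have no idea whether any of them touches whatever is actually accumulating; it is just the kind of clean-up I have in mind.

    ereturn clear      // clear the e() results left by the estimation command
    estimates clear    // drop any estimation results stored with -estimates store-
    discard            // drop cached ado-programs and clear stored results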

I tried increasing the memory allocated to Stata, but that didn't seem to help. I would appreciate any suggestions for improving efficiency here.
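
For reference, what I tried was along these lines (Stata 11 still uses -set memory-; the amount is arbitrary and the data have to be cleared first):

    clear
    set memory 800m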


Al Feiveson



