I have tried taking the average of the resulting t-tables after appending
40,000 t-tests each time, but this practice never seems to converge to one
value, not even after days of calculation.
I have also tried working with the significance level only, but that did
not get me anywhere either.
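For what it is worth, a large finite simulation can be run without ever
appending datasets, which is what exhausts memory: Stata's -simulate-
keeps only the returned statistics in memory, one per replication. A
sketch (the program name onesim and the replication count are my own
choices, and in Stata 9 you would use invnorm(uniform()) rather than
rnormal(), which arrived in Stata 10):

```stata
* Program that draws one sample of 30 from N(0,1) and
* returns the t statistic for testing mean == 0.
capture program drop onesim
program define onesim, rclass
    drop _all
    set obs 30
    gen x = invnorm(uniform())   // one N(0,1) draw per observation
    ttest x == 0
    return scalar t = r(t)
end

set seed 12345

* 100,000 replications; only the t values are stored.
simulate t=r(t), reps(100000): onesim

* Compare the simulated 97.5th percentile with the exact
* critical value from the t distribution with 29 d.f.
_pctile t, p(97.5)
display r(r1) "  versus  " invttail(29, .025)
```

With enough replications the empirical percentiles of the simulated t
values should settle near invttail()'s exact critical values, which also
gives a built-in check on the simulation.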
----- Original Message -----
From: "Maarten buis" <[email protected]>
To: <[email protected]>
Sent: Wednesday, November 12, 2008 6:33 PM
Subject: Re: st: Re: Memory
> --- "Victor M. Zammit" <[email protected]> wrote:
> > I need to t-test an infinite number of random samples of size 30
> > from an infinite, normally distributed population. Each t-value is
> > saved and then appended together to form a database for a t-table.
> > The problem is that I get constrained by memory regardless of the
> > size of the memory in my Stata. I have version 9. Is there a way of
> > getting around this constraint?
>
> If you want to create a table using an infinite number of random
> samples you need an infinite amount of time and an infinite amount of
> memory, regardless of what program or computer you use. So that will
> never work.
>
> -- Maarten
>
>
> -----------------------------------------
> Maarten L. Buis
> Department of Social Research Methodology
> Vrije Universiteit Amsterdam
> Boelelaan 1081
> 1081 HV Amsterdam
> The Netherlands
>
> visiting address:
> Buitenveldertselaan 3 (Metropolitan), room N515
>
> +31 20 5986715
>
> http://home.fsw.vu.nl/m.buis/
> -----------------------------------------
*
* For searches and help try:
* http://www.stata.com/help.cgi?search
* http://www.stata.com/support/statalist/faq
* http://www.ats.ucla.edu/stat/stata/