
From: "Victor M. Zammit" <[email protected]>

To: [email protected]

Subject: Re: st: Re: Memory

Date: Thu, 13 Nov 2008 09:41:19 +0100

```
* a) The data that I have come from generating random samples of whatever
size, in this case of size 31, from a normally distributed, infinitely large
population, i.e.
local i = 1
while `i' <= 40000 {
    drop _all
    set obs 31
    gen a = invnorm(uniform())
    qui ttest a = 0
    replace a = r(t) in 1
    keep in 1
    save a`i', replace
    local i = `i' + 1
}
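* A lighter-weight sketch of the same simulation (assuming Stata's
* -postfile- mechanism is acceptable here): each t-value is streamed to a
* single file on disk as it is produced, so neither the 40,000 intermediate
* datasets nor a large dataset in memory is needed.
tempname sim
postfile `sim' t using ais40000, replace
forvalues j = 1/40000 {
    drop _all
    qui set obs 31
    gen a = invnorm(uniform())
    qui ttest a = 0
    post `sim' (r(t))
}
postclose `sim'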
* I use 40000 due to a memory constraint. Appending the a`i' files together
gives me a variable of 40000 observations, i.e.
use a1, clear
local i = 2
while `i' <= 40000 {
    append using a`i'.dta
    local i = `i' + 1
}
save ais40000,replace
* b) From ais40000.dta I compute the proportion of t-values <= 1.31, which
should give the 90% cumulative density, <= 1.697 the 95% density, etc.,
according to the official t table, i.e.
capture program drop density
program define density
    use ais40000, clear
    count if a <= `1'
    di " density <= " "`1'" " = " r(N)/40000
end
density 1.31
density 1.697
density 2.042
density 2.457
density 2.75
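* For comparison, the exact cumulative probabilities for 30 degrees of
* freedom can be read off Stata's built-in t distribution functions, e.g.
di 1 - ttail(30, 1.697)   // cumulative probability at t = 1.697
di invttail(30, 0.05)     // critical value for the upper 5% tail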
* For smaller degrees of freedom, the discrepancy is much larger. I would
like to know how, if it is at all possible, to resolve the memory constraint.
----- Original Message -----
From: "Kieran McCaul" <[email protected]>
To: <[email protected]>
Sent: Thursday, November 13, 2008 5:44 AM
Subject: RE: st: Re: Memory
> A couple of people have responded to your previous emails and the
> general thrust of these responses would suggest that no one quite
> understands what it is that you are trying to do. Frankly, I don't
> understand what it is that you are trying to do.
>
> This makes it difficult to give advice.
>
> So, to resolve this, why don't you explain:
>
> a) what data you have, and
> b) what question or hypothesis you are seeking to address from these
> data
>
>
>
> ______________________________________________
> Kieran McCaul MPH PhD
> WA Centre for Health & Ageing (M573)
> University of Western Australia
> Level 6, Ainslie House
> 48 Murray St
> Perth 6000
> Phone: (08) 9224-2140
> Fax: (08) 9224 8009
> email: [email protected]
> http://myprofile.cos.com/mccaul
> _______________________________________________
> The fact that no one understands you doesn't make you an artist.
>
>
> -----Original Message-----
> From: [email protected]
> [mailto:[email protected]] On Behalf Of Victor M.
> Zammit
> Sent: Thursday, 13 November 2008 2:52 AM
> To: [email protected]
> Subject: Re: st: Re: Memory
>
> I have tried to take the average of the resulting t tables after appending
> 40,000 t-tests each time, but this practice never seems to converge to one
> value, not even after days of calculations.
> I have also tried to work with the significance level only, but this did
> not get me anywhere, either.
>
>
>
>
> ----- Original Message -----
> From: "Maarten buis" <[email protected]>
> To: <[email protected]>
> Sent: Wednesday, November 12, 2008 6:33 PM
> Subject: Re: st: Re: Memory
>
>
> > --- "Victor M. Zammit" <[email protected]> wrote:
> > > I need to t-test an infinite number of random samples of size 30, from
> > > an infinite, normally distributed population. Each t-value is saved
> > > and then appended together to form a database for a t table. The
> > > problem is that I get constrained by memory regardless of the size of
> > > the memory in my Stata. I have Version 9. Is there a way of getting
> > > around this constraint?
> >
> > If you want to create a table using an infinite number of random
> > samples you need an infinite amount of time and an infinite amount of
> > memory, regardless of what program or computer you use. So that will
> > never work.
> >
> > -- Maarten
> >
> >
> > -----------------------------------------
> > Maarten L. Buis
> > Department of Social Research Methodology
> > Vrije Universiteit Amsterdam
> > Boelelaan 1081
> > 1081 HV Amsterdam
> > The Netherlands
> >
> > visiting address:
> > Buitenveldertselaan 3 (Metropolitan), room N515
> >
> > +31 20 5986715
> >
> > http://home.fsw.vu.nl/m.buis/
> > -----------------------------------------
> >
> >
> >
> > *
> > * For searches and help try:
> > * http://www.stata.com/help.cgi?search
> > * http://www.stata.com/support/statalist/faq
> > * http://www.ats.ucla.edu/stat/stata/
> >
>
```

**Follow-Ups**: **RE: st: Re: Memory** *From:* "Nick Cox" <[email protected]>

**References**: **Re: st: Re: Memory** *From:* "Victor M. Zammit" <[email protected]>

**RE: st: Re: Memory** *From:* "Kieran McCaul" <[email protected]>
