Statalist



Re: st: Re: Memory


From   "Victor M. Zammit" <vmz@vol.net.mt>
To   <statalist@hsphsun2.harvard.edu>
Subject   Re: st: Re: Memory
Date   Fri, 14 Nov 2008 14:29:24 +0100

I haven't for the past six months or so, because the updating process was
taking too long and I had to stop it prematurely. I have version 9.1.
Could this program be modified a little to be compatible with an earlier,
not fully updated installation?
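For what it's worth, -makez- itself uses nothing beyond long-available
commands, so one workaround that sidesteps -simulate- altogether is to
drive the loop with -postfile-, much as Nick Cox suggests further down
this thread. A minimal sketch, assuming Stata 9.1 behaviour and a
hypothetical results file tresults.dta:

set seed 123                               // reproducibility
tempname sim
postfile `sim' t using tresults, replace   // tresults.dta is a made-up name
quietly forvalues i = 1/100000 {
        makez                              // the rclass program quoted below
        post `sim' (r(t))                  // one t statistic per replication
}
postclose `sim'
use tresults, clear

The post file lives outside the data in memory, so the -clear- inside
-makez- does not disturb it.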


----- Original Message -----
From: "Kieran McCaul" <kamccaul@meddent.uwa.edu.au>
To: <statalist@hsphsun2.harvard.edu>
Sent: Friday, November 14, 2008 1:47 AM
Subject: RE: st: Re: Memory


> This program works for me.
>
> Have you updated Stata recently?
>
> ______________________________________________
> Kieran McCaul MPH PhD
> WA Centre for Health & Ageing (M573)
> University of Western Australia
> Level 6, Ainslie House
> 48 Murray St
> Perth 6000
> Phone: (08) 9224-2140
> Fax: (08) 9224 8009
> email: kamccaul@meddent.uwa.edu.au
> http://myprofile.cos.com/mccaul
> _______________________________________________
> The fact that no one understands you doesn't make you an artist.
>
> -----Original Message-----
> From: owner-statalist@hsphsun2.harvard.edu
> [mailto:owner-statalist@hsphsun2.harvard.edu] On Behalf Of Victor M.
> Zammit
> Sent: Friday, 14 November 2008 8:38 AM
> To: statalist@hsphsun2.harvard.edu
> Subject: Re: st: Re: Memory
>
> When I tried that program I got the error "option gweight not allowed"
> and then gave up on it. What can I do to avoid getting that error?
>
>
>
>
> . capture prog drop makez
>
> . prog makez, rclass
>   1.  syntax [, obs(integer 31)]
>   2.  clear
>   3.  set obs `obs'
>   4.  gen a = invnorm(uniform())
>   5.  ttest a=0
>   6.  return scalar t=r(t)
>   7. end
>
> . simul, reps(100000) seed(123): makez
> option gweight not allowed
> r(198);
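>
> (One guess, offered only as a sketch: -simul- was the pre-Stata 8 name
> of the command and is not an abbreviation of -simulate-, so an
> un-updated Stata may be dispatching the call to the old program and
> choking on the new-style options. Spelling the command out in full,
> with an explicit expression list, may avoid the parse error:
>
> . simulate t=r(t), reps(100000) seed(123): makez
>
> Here t=r(t) merely names the returned statistic; the call is otherwise
> unchanged.)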
>
> ----- Original Message ----- 
> From: "Austin Nichols" <austinnichols@gmail.com>
> To: <statalist@hsphsun2.harvard.edu>
> Sent: Thursday, November 13, 2008 7:27 PM
> Subject: Re: st: Re: Memory
>
>
> > Victor M. Zammit <vmz@vol.net.mt>:
> > I don't understand this reply at all.  Did you try the simulation
> > approach?  It takes very little memory:
> >
> > Contains data from \t.dta
> >   obs:       100,000                          simulate: makez
> >  vars:             1                          13 Nov 2008 11:04
> >  size:     1,200,000 (99.9% of memory free)
> >
> > g d=0
> > replace d=1.31  if t>1.31
> > replace d=1.697 if t>1.697
> > replace d=2.042 if t>2.042
> > replace d=2.457 if t>2.457
> > replace d=2.75  if t>2.75
> > tab d
> >
> >           d |      Freq.     Percent        Cum.
> > ------------+-----------------------------------
> >           0 |     90,017       90.02       90.02
> >        1.31 |      4,974        4.97       94.99
> >       1.697 |      2,441        2.44       97.43
> >       2.042 |      1,523        1.52       98.95
> >       2.457 |        552        0.55       99.51
> >        2.75 |        493        0.49      100.00
> > ------------+-----------------------------------
> >       Total |    100,000      100.00
> >
> > _pctile t, nq(100)
> > ret li
> >
> >                 r(r99) =  2.475399255752564
> >                 r(r95) =  1.697941184043884
> >                 r(r90) =  1.30941379070282
> >
> >
> > On Thu, Nov 13, 2008 at 12:21 PM, Victor M. Zammit <vmz@vol.net.mt> wrote:
> > > t-table. I understand that you need a number of random tries, much
> > > bigger than forty thousand, to come to convergence. So I was wondering
> > > if that constraint with memory could be handled.
> > >
> > >
> > > ----- Original Message -----
> > > From: "Austin Nichols" <austinnichols@gmail.com>
> > > To: <statalist@hsphsun2.harvard.edu>
> > > Sent: Thursday, November 13, 2008 4:09 PM
> > > Subject: Re: st: Re: Memory
> > >
> > >
> > >> Victor M. Zammit:
> > >> I also don't understand the point of this exercise--but you should
> > >> read -help simul- and try e.g.
> > >>
> > >> prog makez, rclass
> > >>  syntax [, obs(integer 31)]
> > >>  clear
> > >>  set obs `obs'
> > >>  gen a = invnorm(uniform())
> > >>  ttest a=0
> > >>  return scalar t=r(t)
> > >> end
> > >> simul, reps(100000) seed(123): makez
> > >>
> > >>
> > >> On Thu, Nov 13, 2008 at 9:56 AM, Nick Cox <n.j.cox@durham.ac.uk> wrote:
> > >> > I still don't understand what you are trying to do. But I can
> > >> > comment on your code.
> > >> >
> > >> > You are looping round 40,000 times and writing a single result to
> > >> > 40,000 data files. Then you are looping round to put all those
> > >> > 40,000 data files in one.
> > >> >
> > >> > I'd do that directly this way using just one extra file:
> > >> >
> > >> > clear
> > >> > set obs 31
> > >> > gen a = .
> > >> > tempname out
> > >> > postfile `out' t using myresults.dta
> > >> > qui forval i = 1/40000 {
> > >> >        replace a = invnorm(uniform())
> > >> >        ttest a = 0
> > >> >        post `out' (r(t))
> > >> > }
> > >> > postclose `out'
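> > >> >
> > >> > (To read the results back and pull out the critical values, a
> > >> > sketch assuming the myresults.dta written above:
> > >> >
> > >> > use myresults, clear
> > >> > _pctile t, p(90 95 99)
> > >> > return list
> > >> > )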
> > >> >
> > >> > I still doubt 40,000 is anywhere near big enough to get an answer.
> > >> >
> > >> > Nick
> > >> > n.j.cox@durham.ac.uk
> > >> >
> > >> > Victor M. Zammit
> > >> >
> > >> > * a) The data that I have is from generating random samples of
> > >> > whatever size, in this case of size 31, from a normally distributed,
> > >> > infinitely large population; i.e.
> > >> >
> > >> > local i = 1
> > >> > while `i' <= 40000 {
> > >> >         drop _all
> > >> >         set obs 31
> > >> >         gen a = invnorm(uniform())
> > >> >         qui ttest a = 0
> > >> >         replace a = r(t) in 1
> > >> >         keep in 1
> > >> >         save a`i', replace
> > >> >         local i = `i' + 1
> > >> > }
> > >> >
> > >> > * I use 40000 due to memory constraints. Appending the a[i]'s
> > >> > together gives me a variable of 40000 observations, i.e.
> > >> >
> > >> > use a1, clear
> > >> > local i = 2
> > >> > while `i' <= 40000 {
> > >> >         append using a`i'.dta
> > >> >         local i = `i' + 1
> > >> > }
> > >> > save ais40000, replace
> > >> >
> > >> > * b) From ais40000.dta I get the density <= 1.31, presumably to get
> > >> > the density of 90%, <= 1.697 to get the density of 95%, etc.,
> > >> > according to the official t-table, i.e.
> > >> >
> > >> > capture program drop density
> > >> > program define density
> > >> >         use ais40000, clear
> > >> >         count if a <= `1'
> > >> >         di "density <= " "`1'" " = " r(N)/40000
> > >> > end
> > >> >
> > >> > density 1.31
> > >> > density 1.697
> > >> > density 2.042
> > >> > density 2.457
> > >> > density 2.75
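> > >> >
> > >> > (Each call above reloads ais40000.dta; a sketch of a single-load
> > >> > variant, assuming the same file and cut-offs:
> > >> >
> > >> > use ais40000, clear
> > >> > foreach c in 1.31 1.697 2.042 2.457 2.75 {
> > >> >         quietly count if a <= `c'
> > >> >         di "density <= `c' = " r(N)/40000
> > >> > }
> > >> > )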
> > >> >
> > >> > * For smaller degrees of freedom, the discrepancy is much higher. I
> > >> > would like to know whether it is at all possible to resolve the
> > >> > memory constraint.

*
*   For searches and help try:
*   http://www.stata.com/help.cgi?search
*   http://www.stata.com/support/statalist/faq
*   http://www.ats.ucla.edu/stat/stata/


