Statalist



Re: st: Error 909: obs must be between 0 and NNN


From   Orvalho Joaquim Augusto <[email protected]>
To   [email protected]
Subject   Re: st: Error 909: obs must be between 0 and NNN
Date   Sat, 26 Jan 2008 13:22:47 +0200

I do not know much about your trouble, but there are special issues related to memory management on some operating systems, especially when you need to use more than 1 GB of RAM.

Check this:
http://www.stata.com/support/faqs/win/winmemory.html

Good luck
Orvalho J Augusto


Sergiy Radyakin wrote:

Dear Statalist users,

I wonder if anyone could suggest an explanation for the following: I
am trying to set the number of observations, but Stata refuses to do
so, even though it (probably) should allow me to. Stranger still,
Stata contradicts itself, as I illustrate below. Here are the commands
as they were typed:


// --- Try to get as much memory as possible ----
. set mem 1100m
op. sys. refuses to provide memory
r(909);

// --- Settle for a bargain ----
. set mem 1040m
Current memory allocation
// ------- standard memory allocation report was here -------

. set obs 160000000
obs must be between 0 and 109051879
r(198);

. set obs 109051879
no room to add more observations
// ----- standard explanation was here -----
r(901);

. set obs 100000000
no room to add more observations
// ----- standard explanation was here -----
r(901);

So the value that Stata _itself_ suggests as valid (109,051,879) is
incorrect, and by a significant margin: at least 9%, possibly more.
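For concreteness, the "at least 9%" margin follows directly from the two numbers in the log above, since even 100,000,000 observations were refused:

```python
# Stata reported 109,051,879 as the maximum valid obs,
# yet 100,000,000 also failed with "no room to add more observations".
suggested_max = 109_051_879
largest_refused = 100_000_000

# The suggested maximum therefore overstates the true limit
# by at least this fraction:
overstatement = suggested_max / largest_refused - 1
print(f"{overstatement:.1%}")  # → 9.1%
```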

So here are my puzzles:

1. How does Stata determine how many observations can fit into the
currently allocated memory when the record size is zero? (Which is
exactly our case: no variables are defined after set mem.) As Stata
had suggested, I tried to think "of Stata's data area as the area of a
rectangle", but I still could not divide 1 GB by zero.
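This is only speculation, but the two numbers in the log are consistent with Stata dividing the allocation by an assumed minimum record width of roughly 10 bytes per observation even when no variables exist:

```python
# 'set mem 1040m' allocates 1040 MiB for the data area
allocated = 1040 * 1024 * 1024   # 1,090,519,040 bytes
suggested_max = 109_051_879      # Stata's reported upper bound on obs

# Implied bytes per observation, if Stata divides the allocation
# by a fixed per-record width (a hypothetical interpretation):
bytes_per_obs = allocated / suggested_max
print(f"{bytes_per_obs:.4f}")    # → 10.0000 (almost exactly 10 bytes)
```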

2. Why is the number it determines wrong?

3. Is it wrong only in the case of zero-length records, or will it
also fail to compute the maximum number of observations correctly when
the record size is non-trivial?

4. Given a file of size X on disk, how do I compute the memory I need
to set in order to:
  a) simply open the file and be able to see its contents;
  b) create an additional variable Y of type Z after the file is open.
Here, of course, we are talking about large files (~1 GB), close to
the real-world limits of Stata on 32-bit Windows machines. And the
answer might be that the file cannot be opened at all.
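A minimal back-of-envelope estimator for puzzle 4, assuming (hypothetically, not as Stata's actual rule) that the data area needs roughly obs × record width bytes, where the record width is the sum of the storage widths of Stata's data types (byte = 1, int = 2, long = 4, float = 4, double = 8, strN = N); the `slack` factor and the example figures are illustrative only:

```python
# Stata storage widths in bytes for the numeric types
WIDTH = {"byte": 1, "int": 2, "long": 4, "float": 4, "double": 8}

def record_width(var_types):
    """Sum of per-variable storage widths; 'strN' takes N bytes."""
    total = 0
    for t in var_types:
        total += int(t[3:]) if t.startswith("str") else WIDTH[t]
    return total

def mem_needed_mb(n_obs, var_types, extra_type=None, slack=1.1):
    """Rough memory (MiB) to hold the data (case a) and, optionally,
    one extra variable of type extra_type (case b); the slack factor
    is a guess at per-record overhead."""
    types = list(var_types) + ([extra_type] if extra_type else [])
    return n_obs * record_width(types) * slack / 2**20

# Illustration: 10 million obs, five doubles, then adding one float
print(round(mem_needed_mb(10_000_000, ["double"] * 5)))           # case a
print(round(mem_needed_mb(10_000_000, ["double"] * 5, "float")))  # case b
```

Whether such an estimate leaves enough headroom for the temporaries raised in puzzle 5 is exactly the open question.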

5. This is more a wish than a puzzle, since I am 99% sure the answer
is negative: many Stata commands (both built-in and user-written)
create temporary variables while they work, but it is hard to tell how
many will be created. When working at the margin this becomes
important. Is there any reference table for this purpose? Is there any
way to automatically monitor the number of created variables and
record the largest value, say in the profiler or elsewhere?

[D] obs is uninformative on any of the above puzzles.

Setup: Windows Server 2003, Stata 9.2 MP.


Thank you,
    Sergiy Radyakin
*
*   For searches and help try:
*   http://www.stata.com/support/faqs/res/findit.html
*   http://www.stata.com/support/statalist/faq
*   http://www.ats.ucla.edu/stat/stata/

