Statalist



RE: st: Weights


From   "Martin Weiss" <[email protected]>
To   <[email protected]>
Subject   RE: st: Weights
Date   Thu, 1 May 2008 16:02:12 +0200

Thanks for all the replies, yet the issue is not at all resolved. I had the
file broken up along its columns overnight and -compress-ed the pieces. As
there are slightly more than 600 columns, I went for 7 packages of approx. 90
columns each. Later, I tried to -merge- them back together, but the result is
almost exactly the same as splitting along the rows and -append-ing. Setting
memory to 3G with -set mem-, which is perfectly feasible on 64-bit machines,
I managed to -merge- four of the seven. On the fifth pass through the
-forvalues- loop, Stata complained about a lack of memory. Adding up the
sizes of the 7 -compress-ed files yields the 5.5 G mentioned in the initial
post in this thread. -compress-ing did not help much, and neither did the
-makesmall- program posted earlier in this thread. As for -recast, force-,
that sets off alarm bells for me, as it might destroy valuable information
(which I cannot check for every variable, as there are over 600 of them...).
I do actually like the peace of mind that comes with -compress-...
Now, I cannot possibly post anything on the data, as they are confidential. I
do notice, though, that SPSS 16.0 manages them at half the file size and at
amazing speed. So far, I have only tried descriptives, but as things stand, I
may not need much more with this dataset. Not willing to go through the
frustration of learning the ropes in another package, I went 64-bit precisely
to avoid the quandary I am now in.
As things stand, and in contrast to one of the earlier posts, there is
something magic about the way data are stored and processed in different
packages...

Martin Weiss
_________________________________________________________________

Diplom-Kaufmann Martin Weiss
Mohlstrasse 36
Room 415
72074 Tuebingen
Germany

Phone: 0049-7071-2978184

Home: http://www.wiwi.uni-tuebingen.de/cms/index.php?id=1130

Publications: http://www.wiwi.uni-tuebingen.de/cms/index.php?id=1131

SSRN: http://papers.ssrn.com/sol3/cf_dev/AbsByAuth.cfm?per_id=669945

-----Original Message-----
From: [email protected]
[mailto:[email protected]] On Behalf Of Sergiy Radyakin
Sent: Thursday, May 01, 2008 3:21 AM
To: [email protected]
Subject: Re: st: Weights

On 4/30/08, Michael Blasnik <[email protected]> wrote:
> ...
> First, Martin should note that SPSS does not have some magical way to hold
> data more compactly than Stata can.  You just need to pick the proper

It depends on the data: SPSS has a compressed file format that is more
efficient at storing strings than Stata's format. But you are right that
there is no magic in it.

Sergiy Radyakin

> storage types in Stata.  I would guess that there are variables being held
> as doubles or long strings that don't need that much space.  To get the
> most useful advice, you may want to -describe- one of your dataset chunks
> so that we can see where the storage is being used up.
>
> I often work with fairly large datasets (~1 GB) and one suggestion that I
> haven't seen mentioned thus far is to use -recast- with the force option
> to convert doubles to floats when feasible. I find that -compress- will
> rarely convert a double to a float even though the actual number is stored
> with far less than float precision.
>
> Michael Blasnik
>
>
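A hedged sketch of what Michael's -recast, force- suggestion could look like
in practice, with a per-variable check so that any loss of precision is
spotted before a double is demoted. The loop and variable handling are an
assumption for illustration, not code from this thread:

    ds, has(type double)
    foreach v of varlist `r(varlist)' {
        quietly count if `v' != float(`v') & !missing(`v')
        if r(N) == 0 {
            recast float `v'              // safe: no precision is lost
        }
        else {
            display "`v': " r(N) " values would lose precision -- left as double"
            * recast float `v', force     // only if that loss is acceptable
        }
    }
    compress
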

*
*   For searches and help try:
*   http://www.stata.com/support/faqs/res/findit.html
*   http://www.stata.com/support/statalist/faq
*   http://www.ats.ucla.edu/stat/stata/


