st: RE: RE: RE: AW: Decimal Precision with Destring


From   "Martin Weiss" <[email protected]>
To   <[email protected]>
Subject   st: RE: RE: RE: AW: Decimal Precision with Destring
Date   Mon, 19 Apr 2010 19:57:36 +0200

<>

I think you should -gen- your percent changes as -double-s as well...
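
For instance, a minimal sketch of that suggestion (illustrative only: -test2- and the
time-series setup are taken from the listing further down, and -pch2d- is just an
invented name):

*************
* store the growth rate as a double rather than the default float,
* so the extra digits survive the calculation
gen double pch2d = (test2/L1.test2 - 1)*100
format pch2d %20.0g
*************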


HTH
Martin


-----Original Message-----
From: [email protected]
[mailto:[email protected]] On Behalf Of Nick Cox
Sent: Monday, 19 April 2010 19:36
To: [email protected]
Subject: st: RE: RE: AW: Decimal Precision with Destring

I think your main problem is upstream, i.e. Excel is rounding -pcpie-,
so nothing in Stata can put the lost digits back for you. 

Note that decimal precision is not really the issue, as Stata is
necessarily holding _binary_ approximations. 
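
For example, a minimal sketch of that point (the digits shown in the comments are
approximate and depend on the display format):

*************
* 4.55 has no exact binary representation; float and double merely
* approximate it to different accuracies
di %20.0g float(4.55)   // roughly 4.5500001907348633
di %20.0g 4.55          // roughly 4.5499999999999998
*************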

Nick 
[email protected] 

Sutton, Bennett W.

Martin,

Thanks to the greater expertise of you and Nick, I have tried both of your
suggestions and found that Stata is in fact keeping the precision to 15
digits, but I now have a better understanding of my problem:

* note pcpie is str19
. destring pcpie, gen(test1) force
. format test1 %20.0g
. gen double test2 = real(pcpie)
. format test2 %20.0g

Indeed I am getting the precision to 15 decimal places.

. list year pcpie test1 test2 if year >= 2007 & year <= 2008 & ifscode == 313

        | year                 pcpie                test1                test2 |
        |----------------------------------------------------------------------|
  1161. | 2007   116.477290339629000   116.47729033962899   116.47729033962899 |
  1162. | 2008   121.777007050082000   121.77700705008201   121.77700705008201 |

However, note that test1 and test2 carry trailing digits that do not exist
in the string variable (pcpie).  Because of this, when I calculate the
percent change of test1 and test2, I get results that round up to 4.6
instead of down to 4.5 in year 2008.


. gen pch1 = (test1/L1.test1-1)*100
. format pch1 %20.0g
. gen pch2 = (test2/L1.test2-1)*100
. format pch2 %20.0g
. list year pcpie pch1 pch2 if year >= 2007 & year <= 2008 & ifscode == 313

        | year                 pcpie                 pch1                 pch2 |
        |----------------------------------------------------------------------|
  1161. | 2007   116.477290339629000   2.8499999046325684   2.8499999046325684 |
  1162. | 2008   121.777007050082000   4.5500001907348633   4.5500001907348633 |


Note: taking the growth rate of the numbers as printed in pcpie, you get a
value that rounds down to 4.5.

I know these are insanely small hairs to be splitting, but unfortunately
this work is going into a publication and I need to make sure that the
rounded result will be 4.5 and not 4.6 (and not just here, but in many,
many other calculations).
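
A sketch of one way to get the one-decimal figure once the growth rate is held
as a double (pch2d and pch_pub are invented names; string() with a display
format does the rounding shown):

*************
* hold the growth rate as a double, then build a one-decimal string
* for the published table; string() applies the %4.1f rounding
gen double pch2d = (test2/L1.test2 - 1)*100
gen pch_pub = string(pch2d, "%4.1f")
*************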

Martin Weiss

"Neither option you've sent are working for me."

What does "not working" mean? Note the FAQ...

Sutton, Bennett

I'm using Stata SE 10.  Is there a setting somewhere that may be
controlling the default decimal precision?
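
One setting that does exist is -set type-, which controls the storage type that
-generate- uses by default; a minimal sketch (whether it reaches far enough for
this case is another matter):

*************
* make -generate- create doubles instead of floats by default;
* add ", permanently" to keep the setting across sessions
set type double
*************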

Martin Weiss

-destring- seems to work as well:

*************
clear *

inp str17 myvar
2.343541098765432
2.398784389359001
3.219439049039405
3.199038538208222
end

destring myvar, gen(mynewvar)
format mynewvar %18.0g
l, noo
*************
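
A quick follow-up check one could add (sketch only): -describe- shows the
storage type -destring- chose, and the -assert- compares the result against a
direct double conversion of the string.

*************
* confirm the storage type and that no precision was lost relative
* to converting the string directly with real()
describe mynewvar
assert mynewvar == real(myvar)
*************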

Martin Weiss

This code does seem to work, using your second approach:

*************
clear *

inp str17 myvar
2.343541098765432
2.398784389359001
3.219439049039405
3.199038538208222
end

list, noo

gen double mynewvar = real(myvar)
format mynewvar %18.0g
l, noo
*************

Sutton, Bennett W.

I found one post on this from 5 years ago, but no resolution to the issue
was posted.  I'm wondering if anyone has since encountered and surmounted
this problem:

I am insheet-ing a .csv file with decimal precision set to 15 places in
Excel.  But after insheeting into Stata, the series are data type double
yet only have precision to 10 places.  I'm working with hyperinflation
countries, and the lack of precision is producing some incorrect results
in calculations.

I have also tried reading the data in as string variables, which at least
preserves the 15-digit precision, but destring-ing the data results in a
loss of precision, again to 10 decimal places.
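
One thing to try at the -insheet- stage itself, sketched here with a made-up
filename (it only helps if the lost digits are actually present in the .csv;
see -help insheet- for the -double- option):

*************
* the double option stores numeric variables as doubles when the
* file is read in; the filename here is just a placeholder
insheet using "mydata.csv", double clear
format pcpie %20.0g
*************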


*
*   For searches and help try:
*   http://www.stata.com/help.cgi?search
*   http://www.stata.com/support/statalist/faq
*   http://www.ats.ucla.edu/stat/stata/


