Revisiting Decimal

Bob Ippolito bob at redivi.com
Thu Jan 15 12:50:25 PST 2009


On Thu, Jan 15, 2009 at 12:44 PM, Bob Ippolito <bob at redivi.com> wrote:
> On Thu, Jan 15, 2009 at 12:32 PM, Kris Zyp <kris at sitepen.com> wrote:
>>
>> Brendan Eich wrote:
>>> On Jan 15, 2009, at 10:46 AM, Kris Zyp wrote:
>>>
>>>>
>>>> Brendan Eich wrote:
>>>>> On Jan 15, 2009, at 9:47 AM, Kris Zyp wrote:
>>>>>
>>>>>> Where is the loss coming from?
>>>>>
>>>>> Decimal-using peer S1 encodes
>>>>>
>>>>> {p:1.1, q:2.2}
>>>>>
>>>>> Double-using peer C1 decodes, adds, and returns
>>>>>
>>>>> {p:1.1, q:2.2, r:3.3000000000000003}
>>>>>
>>>>> The sender then checks the result using decimal and finds an
>>>>> error. Meanwhile the same exchange between S1 and decimal-using
>>>>> peer C2 succeeds without error.
>>>>>
>>>>> /be
>>>>>
>>>> Exactly, C1 introduces the error by its use of binary.
>>>
>>> This is not about blame allocation. The system has a problem
>>> because, even though JSON leaves numeric representation
>>> unspecified, higher layers fail to agree. That could be viewed as a
>>> JSON shortcoming, or it could be the fault of the higher layers. I
>>> don't want to debate which is to blame here right now (more below).
>>>
>>>
>>> The point is that all the JS self-hosted JSON implementations I've
>>> seen, and (crucially) the ES3.1 native JSON implementation, use
>>> double, not decimal. This constitutes an interoperation hazard and
>>> it constrains compatible future changes to ES-Harmony --
>>> specifically, we can't switch JSON from double to decimal by
>>> default, when decoding *or* encoding.
>> How do you switch to double or decimal by default on encoding? The
>> input defines it, not any default setting.
>>>
>>>
>>>> The JSON encoding didn't introduce the error. JSON exactly
>>>> represented the data it was given,
>>>
>>> JSON the RFC is about syntax. It doesn't say more than "An
>>> implementation may set limits on the range of numbers" regarding
>>> semantics of implementations.
>>>
>>> Actual implementations that use double and decimal will not
>>> interoperate for all values. That means "not interoperate".
>>>
>>>
>>>> and the decimal decoding and encoding peer refrains from
>>>> introducing errors as well.
>>>
>>> Assuming no errors is nice. (What color is the sky in your world?
>>> :-P) Meanwhile, back on this planet we were debating the best way
>>> to reduce the likelihood of errors when adding decimal to JS. Back
>>> to that debate:
>> 3.3 is exactly the sum of 1.1 and 2.2, without errors, as decimal
>> math produces here in the blue-sky world (I was going off your
>> example). Utah may have unusually blue skies, though; it is the
>> desert :).
>
> Depending on the algorithm that a double-using client uses to print
> floats as decimal, it may not even be able to retain a decimal
> number, even without doing any math operations.
>
> >>> simplejson.dumps(simplejson.loads('{"num": 3.3}'))
> '{"num": 3.2999999999999998}'
>
> simplejson uses the repr() representation for encoding floats. I
> forget the exact inputs for which it goes wrong, but Python's str()
> representation of a float does not always round-trip on all
> platforms.

I was able to dig up a specific example that is reproducible here on
Mac OS X with Python 2.5.2:

>>> float(str(1617161771.7650001)) == 1617161771.7650001
False
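
repr() does round-trip in this case, which is presumably why
simplejson uses it; 17 significant digits are enough to round-trip an
IEEE-754 double, assuming correctly rounded conversions:

>>> float(repr(1617161771.7650001)) == 1617161771.7650001
True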

Firefox 3.0.5 does a better job here:

>>> parseFloat((1617161771.7650001).toString()) === 1617161771.7650001
true
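
Incidentally, Brendan's S1/C1 exchange is easy to sketch in Python,
assuming a simplejson recent enough to have the parse_float hook (the
peer roles are just labels for illustration):

import simplejson
from decimal import Decimal

# S1, the decimal-using peer, encodes the request.
request = '{"p": 1.1, "q": 2.2}'

# C1, the double-using peer, decodes to binary doubles, adds, and
# re-encodes.
doc = simplejson.loads(request)
doc['r'] = doc['p'] + doc['q']  # binary double: 3.3000000000000003
reply = simplejson.dumps(doc)

# S1 decodes the reply as decimals and checks C1's arithmetic.
checked = simplejson.loads(reply, parse_float=Decimal)
print checked['r'] == Decimal('3.3')  # False: C1's binary math leaked in

Since simplejson encodes floats with repr(), the reply carries r as
3.3000000000000003, so the decimal-using peer sees the mismatch.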

-bob

