Revisiting Decimal

Bob Ippolito bob at
Thu Jan 15 12:44:47 PST 2009

On Thu, Jan 15, 2009 at 12:32 PM, Kris Zyp <kris at> wrote:
> Hash: SHA1
> Brendan Eich wrote:
>> On Jan 15, 2009, at 10:46 AM, Kris Zyp wrote:
>>> Brendan Eich wrote:
>>>> On Jan 15, 2009, at 9:47 AM, Kris Zyp wrote:
>>>>> Where is the loss coming from?
>>>> Decimal-using peer S1 encodes
>>>> {p:1.1, q:2.2}
>>>> Double-using peer C1 decodes, adds, and returns
>>>> {p:1.1, q:2.2, r:3.3000000000000003}
>>>> The sender then checks the result using decimal and finds an
>>>> error. Meanwhile the same exchange between S1 and decimal-using
>>>> peer C2 succeeds without error.
>>>> /be
>>> Exactly, C1 introduces the error by its use of binary.
>> This is not about blame allocation. The system has a problem
>> because, even though JSON leaves numeric representation
>> unspecified, higher layers fail to agree. That could be viewed as a
>> JSON shortcoming, or it could be the fault of the higher layers. I
>> don't want to debate which is to blame here right now (more below).
>> The point is that all the JS self-hosted JSON implementations I've
>> seen, and (crucially) the ES3.1 native JSON implementation, use
>> double, not decimal. This constitutes an interoperation hazard and
>> it constrains compatible future changes to ES-Harmony --
>> specifically, we can't switch JSON from double to decimal by
>> default, when decoding *or* encoding.
> How do you switch to double or decimal by default on encoding? The
> input defines it, not any default setting.
>>> The JSON encoding didn't introduce the error. JSON exactly
>>> represented the data it was given,
>> JSON the RFC is about syntax. It doesn't say more than "An
>> implementation may set limits on the range of numbers" regarding
>> semantics of implementations.
>> Actual implementations that use double and decimal will not
>> interoperate for all values. That means "not interoperate".
>>> and the decimal decoding and encoding peer refrains from
>>> introducing errors as well.
>> Assuming no errors is nice. (What color is the sky in your world?
>> :-P) Meanwhile, back on this planet we were debating the best way
>> to reduce the likelihood of errors when adding decimal to JS. Back
>> to that debate:
> 3.3 is exactly the sum of 1.1 and 2.2 without errors as decimal math
> produces here in the blue sky world (I was going off your example).
> Utah may have unusually blue skies though, it is the desert :).
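The mismatch described upthread is easy to reproduce. A quick Python sketch (assuming IEEE-754 doubles and the stdlib decimal module):

```python
from decimal import Decimal

# Double-using peer: 1.1 and 2.2 are already binary approximations,
# and their sum carries visible representation error.
r_double = 1.1 + 2.2
print(repr(r_double))                    # 3.3000000000000003

# Decimal-using peer: the same sum is exact.
r_decimal = Decimal('1.1') + Decimal('2.2')
print(r_decimal == Decimal('3.3'))       # True
```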

Depending on the algorithm a double-using client uses to print floats
as decimal strings, it may be unable to preserve a decimal number
even without performing any math operations at all.

>>> simplejson.dumps(simplejson.loads('{"num": 3.3}'))
'{"num": 3.2999999999999998}'

simplejson uses the repr() representation when encoding floats. I
forget the exact inputs for which it goes wrong, but Python's str()
representation of floats does not round-trip correctly all of the
time on all platforms.
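To make the round-trip point concrete, here is a sketch (the exact digits assume IEEE-754 doubles; the %g format strings stand in for the fixed-precision formatting that older repr()/str() implementations used):

```python
from decimal import Decimal

# The double nearest to 3.3 is not 3.3; Decimal exposes the stored value.
stored = Decimal(3.3)
print(stored == Decimal('3.3'))          # False

# A 17-significant-digit formatter prints the representation noise:
print('%.17g' % 3.3)                     # 3.2999999999999998

# A 12-digit formatter hides the noise for 3.3, but then fails to
# round-trip other doubles, e.g. the result of 0.1 + 0.2:
x = 0.1 + 0.2
print(float('%.12g' % x) == x)           # False
```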


More information about the Es-discuss mailing list