Revisiting Decimal

Kris Zyp kris at sitepen.com
Thu Jan 15 09:47:42 PST 2009


Bob Ippolito wrote:
> On Thu, Jan 15, 2009 at 5:49 AM, Kris Zyp <kris at sitepen.com> wrote:
>
>> Bob Ippolito wrote:
>>> On Wed, Jan 14, 2009 at 9:32 PM, Kris Zyp <kris at sitepen.com>
>>> wrote:
>>>
>>>> Brendan Eich wrote:
>>>>> On Jan 14, 2009, at 7:38 PM, Kris Zyp wrote:
>>>>>>> You need to change this in any case, since even though the
>>>>>>> JSON RFC allows arbitrary precision decimal literals,
>>>>>>> real-world decoders only decode into IEEE doubles. You'd
>>>>>>> have to encode decimals as strings and decode them using
>>>>>>> domain-specific (JSON schema based) type knowledge.
>>>>>> No, every Java JSON library I have seen parses (at least
>>>>>> some, if not all) numbers to Java's BigDecimal.
>>>>> You've seen http://www.json.org/json2.js It and the json.js
>>>>> alternative JS implementation are popular. json2.js contains
>>>>> String.prototype.toJSON = Number.prototype.toJSON =
>>>>> Boolean.prototype.toJSON = function (key) { return
>>>>> this.valueOf(); };
>>>> Of course; since there is no decimal support in ES3, there is
>>>> no other option.
>>>>> JSON has nothing to do with Java, and most implementations do
>>>>> not have Java BigDecimal, so I don't know how it can be
>>>>> relevant.
>>>> One of the major incentives for JSON is interoperability
>>>> between languages. If implementations in other languages treat
>>>> JSON numbers as decimal, then the assertion I understood you to
>>>> be making -- that JSON numbers are universally expected to be
>>>> treated as binary -- is not true.
>>>>>> JSON's numbers are decimal, and languages that support
>>>>>> decimals agree. Dojo _will_ convert JS decimals to JSON
>>>>>> numbers regardless of what path ES-Harmony takes with typeof,
>>>>>> whether it requires a code change or not.
>>>>> That will break interoperability between current
>>>>> implementations that use doubles, not decimals.
>>>> How so? And how did all the implementations that use decimals
>>>> to interpret JSON numbers not break interoperability?
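To make the precision point concrete, here is a minimal Python 3
sketch of what decoding JSON numbers only into IEEE doubles does to
extra decimal digits (Python and the literals here are purely
illustrative for this JS/Java discussion):

    import json
    from decimal import Decimal

    # Two different decimal literals collapse into the same IEEE double:
    a = json.loads("1.3")
    b = json.loads("1.30000000000000004")
    assert a == b  # the extra digits were silently discarded

    # The double that was stored is not the decimal that was written:
    print(Decimal(a))
    # 1.3000000000000000444089209850062616169452667236328125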
>>> I'm pretty sure that interoperability is broken when they do
>>> this; it's just very subtle and hard to debug. I have the same
>>> stance as Brendan here: I've even refused to implement the
>>> capability to directly encode decimal as JSON numbers in my
>>> simplejson package (the de facto JSON library for Python). If a
>>> user of the library controls both ends of the wire, they can
>>> just as easily use strings to represent decimals and work with
>>> them exactly how they expect on both ends of the wire regardless
>>> of what their JSON implementation happens to do.
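In Python terms, that policy looks roughly like this (a sketch
against the stdlib json module, which refuses decimals by default
just as the simplejson described above does; the field name and
value are made up):

    import json
    from decimal import Decimal

    try:
        json.dumps({"price": Decimal("19.99")})
    except TypeError:
        pass  # decimals are not encodable by default; the caller decides

    # Explicitly choosing strings works identically on both ends:
    doc = json.dumps({"price": str(Decimal("19.99"))})
    price = Decimal(json.loads(doc)["price"])  # Decimal('19.99')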
>>>
>>> Imagine the person at the other end of the wire is using
>>> something like JavaScript or PHP. If the message contains
>>> decimals as JSON numbers, they cannot accurately encode or
>>> decode those messages unless they write their own custom JSON
>>> implementation. How do they even KNOW whether the document is
>>> supposed to have decimal precision? What if the other end passes
>>> too many digits (often the case if one side is actually using
>>> doubles)? If the numbers are passed around as strings, then
>>> everyone can use the document just fine without any
>>> compatibility issues. The lack of a de jure number precision and
>>> the lack of a date/datetime type are definitely my biggest
>>> grievances with the JSON spec.
>> Specifying number representations would be far more grievous in
>> terms of creating tight coupling with JSON data. It is essential
>> that implementations be free to use whatever number
>> representation they desire in order to facilitate loosely coupled
>> interchange.
>>
>
> For decimals, I definitely disagree here. In languages that
> support both float and decimal, it's confusing at best. You can
> only decode as one or the other, and if you try to do any math
> afterwards with the wrong type it will explode. In Python's case,
> you can't even convert a float directly to a decimal without
> explicitly going through a string first. simplejson raises an
> exception when you try to encode a decimal unless you tell it
> otherwise; it makes you decide how decimals should be represented.
>
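Concretely, in Python 3 (a sketch; in the Python 2.x current when
this thread was written, even Decimal(1.1) raised a TypeError, and
Decimal.from_float only arrived in 2.7):

    from decimal import Decimal

    Decimal("1.10") + 1.1
    # TypeError: unsupported operand type(s) for +:
    #            'decimal.Decimal' and 'float'

    # The explicit float-to-decimal route goes through a string:
    Decimal(str(1.1))  # Decimal('1.1')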
> In simplejson it's trivial to transcode decimal to float (or string
> or anything else) during encoding, or to get all numbers back as
> decimal... but you have to do it explicitly. Loosely coupled
> doesn't have to mean lossy.
>
> -bob
>
Where is the loss coming from? JSON isn't doing any computations or
coercions, and ES would only experience a loss when serializing
binary floats to JSON, not decimals. Decoders should be free to be
explicit and to control how they internally represent the numbers
they receive from JSON. Putting decimals in string format doesn't
change that fact, and it is far more confusing.
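For example, in Python 3 a receiver can opt into whatever internal
representation it wants without the wire format changing (a stdlib
json sketch; simplejson exposes a similar hook, and the field name
is made up):

    import json
    from decimal import Decimal

    doc = '{"price": 1.30000000000000004}'

    json.loads(doc)["price"]                       # 1.3, an IEEE double
    json.loads(doc, parse_float=Decimal)["price"]
    # Decimal('1.30000000000000004')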
Kris


--
Kris Zyp
SitePen
(503) 806-1841
http://sitepen.com


