JSON numbers (was: Revisiting Decimal)

Sam Ruby rubys at intertwingly.net
Fri Jan 16 19:46:05 PST 2009


Brendan Eich wrote:
> On Jan 15, 2009, at 7:28 PM, Sam Ruby wrote:
> 
>> On Thu, Jan 15, 2009 at 9:24 PM, Brendan Eich <brendan at mozilla.com> 
>> wrote:
>>>
>>> JSON's intended semantics may be arbitrary precision decimal (the RFC is
>>> neither explicit nor specific enough in my opinion; it mentions only
>>> "range", not "precision"), but not all real-world JSON codecs use 
>>> arbitrary
>>> precision decimal, and in particular today's JS codecs use IEEE double
>>> binary floating point. This "approximates" by default and creates a 
>>> de-facto
>>> standard that can't be compatibly extended without opt-in.
>>
>> You might find the next link enlightening or perhaps even a pleasant 
>> diversion:
>>
>> http://www.intertwingly.net/stories/2002/02/01/toInfinityAndBeyondTheQuestForSoapInteroperability.html 
>>
>>
>> Quick summary as it applies to this discussion: perfection is
>> unattainable (duh!) and an implementation which implements JSON
>> numbers as quad decimal will retain more precision than one that
>> implements JSON numbers as double binary (duh!).
> 
> DuhT^2 ;-).
> 
> But more than that: discounting the plain fact that on the web at least, 
> SOAP lost to JSON (Google dropped its SOAP APIs a while ago), do you 
> draw any conclusions?
> 
> My conclusion, crustier and ornerier as I age, is that mixed-mode 
> arithmetic with implicit conversions and "best effort" "approximation" 
> is a botch and a blight. That's why I won't have it in JSON, encoding 
> *and* decoding.

My age differs from yours by a mere few months.

My point was not SOAP-specific; it dealt with the interoperability of 
such things as dates and dollars in a cross-platform setting.

My conclusion is that precision is perceived as a quality-of-
implementation issue.  Implementations that preserve the most precision 
are perceived to be of higher quality than those that don't.

I view any choice that treats binary64 as preferable to decimal128 as 
choosing *both* botch *and* blight.
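
To illustrate (a sketch in Python rather than ES, since Python's stdlib
json module can decode into either representation; the literal below is
made up for the example):

    # binary64 silently rounds a long decimal literal on decode, while
    # an arbitrary-precision decimal type keeps every digit as sent.
    import json
    from decimal import Decimal

    text = '{"amount": 2.00000000000000001}'

    as_double = json.loads(text)                        # default: binary64
    as_decimal = json.loads(text, parse_float=Decimal)  # all digits kept

    print(as_double["amount"])    # 2.0 -- the trailing digit is gone
    print(as_decimal["amount"])   # 2.00000000000000001 -- unchanged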

Put another way, if somebody sends you a quantity and you send back the 
same quantity (i.e., merely round-trip the data), the originator will 
see it as unchanged so long as their precision is less than or equal to 
that of the partner in the exchange.  This leads to a natural ordering 
of implementations from most-compatible to least.
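
A sketch of that round trip, under the same assumptions as above (the
payload is hypothetical):

    # A decoder that keeps at least as much precision as the sender used
    # echoes the value back unchanged; a binary64 decoder may not.
    import json
    from decimal import Decimal

    sent = '{"price": 19.9999999999999999}'

    # Partner A decodes to binary64 and re-encodes:
    print(json.dumps(json.loads(sent)))   # {"price": 20.0} -- changed

    # Partner B decodes to arbitrary-precision decimal and re-encodes.
    # (json.dumps can't serialize Decimal directly, so the re-encoding
    # here is done by hand, purely for illustration.)
    price = json.loads(sent, parse_float=Decimal)["price"]
    print('{"price": %s}' % price)        # unchanged on the round trip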

A tangible analogy that might make sense to you, or might not.  Ever 
try rsync'ing *to* a Windows box?  Rsync from Windows to Windows works 
just fine.  Unix to Unix also.  As does Windows->Unix->Windows.  But 
Unix->Windows->Unix needs fudge parameters.  Do you really want to be 
the Windows in this equation?  :-)

- Sam Ruby

P.S.  You asked my opinion, and I've given it.  This is something I have
   an opinion on, but not something I would view as an egregious error
   if the decision goes the other way.

