Revisiting Decimal

Kris Zyp kris at sitepen.com
Thu Jan 15 15:05:06 PST 2009



>
>
>> You are saying there are latent hard-to-find bugs because people
>> believe that JSON somehow implies that the sum of {"p":1.1,
>> "q":2.2} must be 3.3000000000000003 ?
>
> I never wrote any such thing.
>
> Please look at the previous messages again. If C1 uses double to
> decode JSON from S1 but C2 uses decimal, then results can differ
> unexpectedly (from S1's point of view). Likewise, encoding decimal
> using JSON's number syntax also breaks interoperation with
> implementations using double to decode and (re-)encode.
>
>
>> If people are returning 3.3, then the argument
>> that JSON numbers are universally treated computed as binary is not
>> valid. Is there a less hand-wavy way of stating that?
>
> I don't know what "treated computed as binary" means, even if I
> delete one of "treated" and "computed". JSON number syntax may be
> encoded from decimals by some implementations, and decoded into
> decimals too. This is not interoperable with implementations that
> transcode using doubles. Period, full stop.
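
For concreteness, the divergence at issue is observable in any
double-based engine today:

    var o = JSON.parse('{"p":1.1, "q":2.2}');  // decoded as binary doubles
    o.p + o.q;   // 3.3000000000000003
    // a decimal-based decoder handed the same wire data would compute 3.3
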
>
>
>> I thought JSON serialization and typeof results could be considered
>> separate issues.
>
> You brought up Dojo code examples including Dojo's JSON codec as
> evidence that defining typeof 1.1m == "number" would relieve you of
> having to change that codec while at the same time preserving
> correctness. I replied showing that the preserving correctness claim
> in that case is false, and the relief from having to evolve the
> codec was an obligation.
>
> We then talked more about JSON than typeof, but the two are related:
> in both JSON *implementations* and your proposed typeof 1.1m ==
> "number" && typeof 1.1 == "number" world, incompatible number
> formats are conflated. This is a mistake.
Yes, they are related, but we can make separate decisions on them. I
am more ambivalent about typeof 1.1m than about what seems to me to be
the more obvious mistake of throwing on JSON serialization of decimals.
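
To make the typeof question concrete: under the proposal, a common
guard like the one below could no longer tell the two formats apart
(the m-literal is hypothetical, of course):

    // Assuming the proposed semantics, where hypothetically:
    //   typeof 1.1  === "number"
    //   typeof 1.1m === "number"
    function addToTotal(total, x) {
      if (typeof x !== "number")
        throw new TypeError("expected a number");
      return total + x;   // double and decimal would both pass the guard
    }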

>
>
>>> You're arguing by assertion that rounding errors due to double's
>>> finite binary precision, which are the most reported JS bug at
>>> https://bugzilla.mozilla.org, are somehow insignificant when JSON
>>> transcoding is in the mix. That's a bold assertion.
>> The issue here is relying on another machine to do a computation. I
>> have trouble believing that all these people that are experiencing
>> rounding errors are then using these client-side computations for
>> their server.
>
> Please. No one wrote "all these people". We're talking about subtle
> latent and future bugs, likelihoods of such bugs (vs. ruling them
> out by not conflating incompatible number types). Correctness is not
> a matter of wishful thinking or alleged "good enough" current-code
> behavior.
>
>
>> The compensation for rounding errors that we are concerned about
>> is usually going to be kept as close to the error as possible. Why
>> would you build a client-server infrastructure around it?
>
> People do financial stuff in JS. No medical equipment or rocket
> control yet, AFAIK (I could be wrong). I'm told Google Finance uses
> integral double values to count pennies. It would not be surprising
> if JSON transcoding were already interposed between parts of such a
> system. And it should be possible to do so, of course -- one can
> always encode bignums or bigdecimals in strings.
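
For reference, the string escape hatch is available today with the
standard replacer hook; here is a minimal sketch using a stand-in
Decimal wrapper (the wrapper is hypothetical; any bigdecimal library
would do):

    function Decimal(repr) { this.repr = String(repr); }  // stand-in type

    var json = JSON.stringify({price: new Decimal("1.99")}, function (k, v) {
      return v instanceof Decimal ? v.repr : v;  // explicit opt-in to strings
    });
    // json === '{"price":"1.99"}' -- survives any double-based peer intact
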
>
> What's at issue between us is whether the default encoding of
> decimal should use JSON's number syntax. If someone builds
> client-server infrastructure, uses JSON in the middle, and switches
> from double today to decimal tomorrow, what can go wrong if we
> follow your proposal and encode decimals using JSON number syntax?
> Assume the JSON is not in the middle of a closed network where one
> entity controls the version and quality of all peer software. We
> can't assume otherwise in the standard.
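
Concretely, the failure mode in such a mixed deployment would look
like this (the decimal-aware encoder is hypothetical; the client half
runs today):

    // Server (decimal-aware) encodes prices from decimals as bare numbers:
    var wire = '{"subtotal":1.10, "tax":2.20}';

    // An older double-based client decodes, computes, and re-encodes:
    var o = JSON.parse(wire);
    JSON.stringify({total: o.subtotal + o.tax});
    // '{"total":3.3000000000000003}' -- returned to a server expecting 3.30
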
>
>
>>> What should JSON.parse use then, if not double (binary)? JSON.parse
>>> is in ES3.1, and decimal is not.
>> It should use double. I presume that if a "use decimal" pragma or a
>> switch was available, it might parse to decimal, but the default would
>> be double, I would think.
>
> Good, we agreed on decoding to double already, but it's great to
> confirm this.
>
>
>>> which breaks round-tripping, which breaks interoperation.
>> JSON doesn't round-trip JS, and it never will.
>
> That's a complete straw man. Yes, NaN and the infinities won't round
> trip. But number syntax in JSON per the RFC, in combination with
> correct, Steele-and-Gay-conformant dtoa and strtod code
> (ftp://ftp.ccs.neu.edu/pub/people/will/retrospective.pdf), can
> indeed round-trip finite values. This should be reliable.
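
That round-tripping is easy to check in an engine with conformant
dtoa/strtod:

    JSON.parse(JSON.stringify(0.1)) === 0.1;      // true
    JSON.parse(JSON.stringify(1 / 3)) === 1 / 3;  // true -- the shortest
    // digit string dtoa picks re-reads to exactly the same double
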
>
>
>> I presume that if a receiver had a "use decimal" pragma, that
>> could count as opt-in to parsing numbers into decimal, and then
>> you could round-trip decimals, but only if the sender was properly
>> encoding decimals as JSON numbers (decimals).
>
> Yeah, only if. Receiver makes it wrong. Nothing in the over-the-wire
> data requires the receiver to use decimal, or fail if it lacks
> decimal support.
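
For reference, the wire data really is untagged, so a double-based
receiver decodes decimal-encoded numbers without any error or marker:

    JSON.parse('1.10');  // 1.1 -- silently a double; nothing records
                         // that a decimal was intended
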
>
> You wrote in your last message:
>
> I am not asserting that JSON decoding should automatically convert
> JSON numbers to decimal, only that JSON encoding should serialize
> decimals to numbers.
>
> This is likely to create real bugs in the face of decoders that lack
> decimal. There is zero likelihood of such bugs if we don't
> incompatibly encode decimal using JSON number syntax when adding
> decimal to a future standard.
>
>
>> Encoding the decimals as strings is far worse.
>
> Throwing by default -- in the absence of explicit opt-in by the
> encoder-user -- is far better.
>
> It's still up to the producer of the data to worry about tagging the
> decimal type of the JSON number, or hoping that the data stays in
> its silo where it's implicitly typed as decimal. But we won't have
> accidents where implicitly -- by default -- decimal users encode
> JSON data that is incorrectly decoded.
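
Complementing the encoding sketch above, a receiver in on the
convention can opt in explicitly with the standard reviver hook, while
an unaware receiver just sees strings (the key-based convention is
hypothetical):

    function Decimal(repr) { this.repr = String(repr); }  // same stand-in

    var obj = JSON.parse('{"price":"1.99"}', function (k, v) {
      return k === "price" ? new Decimal(v) : v;  // explicit, per-silo opt-in
    });
    // obj.price.repr === "1.99" -- no double rounding ever occurred
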
>
>
>>> We are not condemned to repeat history if we pay attention to what
>>> went before. JSON implementations in future ES specs cannot by
>>> default switch either encoding or decoding to use decimal instead
>>> of number.
>> The decimal number has been around much longer than the computer. Are
>> you saying that a particular language type has more permanence?
>
> I think you know exactly what I'm saying. One (lingua franca, French
> as the common diplomatic language) particular format is better than
> many. And we are stuck with double today. So we cannot start
> encoding decimals as JSON numbers tomorrow. Please stop ducking and
> weaving and address this head on. If you really endorse "receiver
> makes it right", give a spirited and explicit defense.
JSON already encodes in decimal. Do you want a defense of how a
receiver should make right what is already right? There is no point
arguing over whether JSON should have a more explicit type system;
JSON is frozen.

Are we really making any progress with these point-by-point
arguments? If so, I could continue the point-by-point discussion, but
it seems like we are drifting away from the main issue. All decimal
use is opt-in anyway; there is no breakage for existing code when the
VM is upgraded. So the main issue is what will be most reasonable and
sensible for users who have explicitly opted to use decimals. If a
user writes JSON.stringify({price:2m - 0.01m}), it seems by far most
intuitive that they want it serialized as "{\"price\":1.99}". That we
have a decimal format in JSON and yet wouldn't encode ES's decimal
with it just seems bizarre to me, even after reading all the
arguments.
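
A toy illustration of the output I mean (the m-literals and native
decimal support are hypothetical; this stand-in merely shows the
intended wire format):

    function Decimal(repr) { this.repr = String(repr); }
    Decimal.prototype.toJSON = function () {
      // Illustrative only: this routes through a double, which a real
      // decimal serializer would not do, but the emitted JSON is the point.
      return Number(this.repr);
    };

    JSON.stringify({price: new Decimal("1.99")});  // '{"price":1.99}'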

Thanks,
Kris


