Decimal comparisons

Brendan Eich brendan at mozilla.org
Fri Sep 19 06:21:48 PDT 2008


On Sep 19, 2008, at 8:45 AM, Sam Ruby wrote:

> The motivation for the fourth choice on the first question is to
> produce a value that is valid JSON, is unlikely to be widely used
> today (by virtue of the capital E), will fall back to binary64 in
> many JSON implementations, and can be used as a signal to produce a
> decimal value in ECMAScript.  I'm merely putting this forward as
> brainstorming at this point; I'm less than enthusiastic about it
> myself.

http://www.ietf.org/rfc/rfc4627.txt underspecifies numbers enough to
allow this kind of trick, but it is possible that the de-facto JSON
standard requires IEEE double at the limit.
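
To make the "at the limit" point concrete: with json2.js (or a native
JSON.parse, where available), numbers are materialized as binary64
doubles, so integers past 2^53 already collapse silently:

    // 9007199254740993 is 2^53 + 1, which binary64 cannot represent.
    var n = JSON.parse("[9007199254740993]")[0];
    n === 9007199254740992;  // true: rounded to the nearest double

A peer that depends on those digits surviving is really binding to
IEEE double, not to the grammar in the RFC.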


> Does the committee feel that it can ever add new values to typeof
> under any circumstances?

Certainly not if there is "opt-in version selection".

Even with "the default version", people on this list have argued that  
adding a new typeof-type does the least harm and probably could be  
pulled off, because most typeof-using code does not test type-code  
values exhaustively. But it's an open question.
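
To sketch the distinction (using a hypothetical "decimal" result for
typeof):

    // The common pattern is a positive test. It keeps working if
    // typeof grows a new result; decimal values just fall through:
    if (typeof x === "number") {
        // binary64 fast path
    } else {
        // generic path; a decimal lands here, usually harmlessly
    }

    // The rare pattern is an exhaustive dispatch on type codes,
    // which a new "decimal" result would break:
    switch (typeof x) {
        case "number":   /* ... */ break;
        case "string":   /* ... */ break;
        case "boolean":  /* ... */ break;
        case "object":   /* ... */ break;
        case "function": /* ... */ break;
        default:
            throw new TypeError("unexpected type: " + typeof x);
    }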


> FWIW, my preference is (in order): decimal, object, then number.

I'm with you there. The neo-Platonist camp (:-P) argues for "number"
based on the Reals, but machines and ES numbers do not behave like
Reals. And I don't believe "object" is thinkable unless 0m converts
to false (that is, !0m === true); otherwise widely distributed code
will fail on decimal values.
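
Concretely (again with hypothetical decimal literals): every ordinary
object is truthy in ES, so if decimals were plain objects, the
ubiquitous "falsy zero" idiom would misbehave on a decimal zero:

    // Widely deployed idiom: treat a zero count as "nothing there".
    function describe(count) {
        if (!count) {
            return "none";  // 0 takes this path today
        }
        return "some";
    }

    describe(0);   // "none"
    describe(0m);  // must also be "none", so !0m must be true;
                   // ordinary Object semantics (always truthy)
                   // would get this wrong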


> I think it is preposterous to assume that ECMAScript can never add any
> new data types.

Agreed.


> The version of JSON that is included in the language
> will be aware of the data types supported in that edition of
> ECMAScript.  What json2.js does for decimal, it does for any object
> data type that it doesn't understand.

This is not so clear. JSON is a cross-language, inter-networked data
standard. It won't be easy to grow it in parallel with ES standards,
and it shouldn't grow without versioning; versioning itself is
something I'm sure Doug agrees should be avoided at all costs.

The risk for JSON is that, because it underspecifies, real-world uses
bind to specific implementation details such as "number" being double.


> As far as what the builtin JSON functionality slated for ECMAScript
> 3.1 does when parsing "[1.1]", a case could be made that producing
> [1.1m] is the closest to expressing what the data structure actually
> conveys, is readily convertible to binary64 floating point when
> needed, and typical JSON isn't anywhere near as performance-critical
> as vector-graphics-intensive functions are.

Yeah, you're right that this case could be made from the RFC, although
the RFC talks only about range restrictions, not radix or rounding.
But when interoperating with a JSON peer today, you might find
failures to round-trip. That seems bad.
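
For example (a sketch, assuming a JSON.parse that produces decimals):
decimal128 carries more significant digits than binary64 preserves, so
a value minted on the decimal side need not survive a double-based
peer:

    // Decimal side serializes 19 significant digits:
    // JSON.stringify([1.234567890123456789m])
    //   => "[1.234567890123456789]"

    // A double-based peer parses and re-serializes:
    JSON.stringify(JSON.parse("[1.234567890123456789]"));
    // => "[1.2345678901234568]"; the trailing digits are gone, and
    // the two peers now disagree about the value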

/be

