Consistent decimal semantics

Brendan Eich brendan at
Wed Sep 3 21:35:13 PDT 2008

On Sep 3, 2008, at 9:16 PM, Sam Ruby wrote:

> You act as if logic applies here.  (And, yes, I'm being  
> facetious).  If that were true, then a === b => 1/a === 1/b.  But  
> that's not the case with ES3, when a=0 and b=-0.
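[Editor's note: the 0 / -0 counterexample above is runnable in any ES3-or-later engine; a minimal sketch:]

```javascript
// ES3 counterexample: a === b does not imply 1/a === 1/b.
// 0 and -0 compare as strictly equal, but dividing by them
// yields Infinity and -Infinity respectively.
var a = 0;
var b = -0;
console.log(a === b);         // true
console.log(1 / a);           // Infinity
console.log(1 / b);           // -Infinity
console.log(1 / a === 1 / b); // false
```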

I didn't say "for all" -- the particular hard cases are 0m converting
to true, in light of real code that assumes typeof x == "object" &&
!x => x is null; and 1.1m == 1.1 or other power-of-five problems. No
universal quantification here -- particular problems, counterexamples.

> 1.1 and 1.1m, despite appearing quite similar, actually identify two
> (marginally) different points on the real number line.  This does
> not cause an issue with #2 above, any more than substituting the
> full equivalents for these constants causes an issue.

P754 says otherwise -- see (3). And (3), in light of (2), implies
typeof 1.1m != "number". Assume otherwise:

typeof 1.1m == "number"
typeof 1.1  == "number"
1.1m == 1.1 && 1.1m === 1.1

but this contradicts (3) -- P754, the holy writ :-P.
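[Editor's note: the "power-of-five problem" is that 1.1 = 11/10 has a factor of 5 in its denominator, so it has no exact binary representation; the Number literal 1.1 is only the nearest binary64 value, while 1.1m would denote exactly 1.1. Decimal never shipped, but the binary side of the gap is observable today:]

```javascript
// The double closest to 1.1 is not exactly 1.1; printing extra
// digits exposes the difference a decimal type would not have.
console.log((1.1).toFixed(20)); // "1.10000000000000008882"

// The same rounding is behind the classic inequality:
console.log(0.1 + 0.2 === 0.3); // false
```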

> Similarly, the only apparent issue with #1 is the assumption that
> !(new Decimal(0)) is true.  But !(new Number(0)) is false, so there
> is no reason that the same logic couldn't apply to Decimal.

You are seriously proposing that !0m is false? Interesting.
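[Editor's note: the object-wrapper behavior being argued about is easy to check -- every object is truthy, even a wrapper around zero, so an object-typed 0m would necessarily be truthy too:]

```javascript
// All objects are truthy, including Number wrappers for zero.
console.log(!(new Number(0))); // false -- the wrapper object is truthy
console.log(!0);               // true  -- the primitive zero is falsy
// Hence: if 0m were object-typed, `if (!x)` could never see it as falsy.
```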

> If typeof(0m) is "number", then !0m should be true.
> If typeof(0m) is "object", then !0m should be false.
> If typeof(0m) is "decimal", then we are free to decide what !0m  
> should be.
> My preference is for typeof(0m) to be "decimal" and for !0m to be  
> true.

That's my preference now too, but based on more than aesthetics.
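[Editor's note: Decimal itself never shipped, but BigInt -- a later primitive added to the language -- implements exactly the semantics preferred here, and can stand in as a sketch: a new typeof result, with that type's zero still falsy.]

```javascript
// BigInt as a model for "typeof 0m is a new code, and !0m is true":
console.log(typeof 0n); // "bigint" -- a new typeof result, not "number" or "object"
console.log(!0n);       // true     -- the type's zero stays falsy
console.log(0n == 0);   // true     -- loose equality converts across types
console.log(0n === 0);  // false    -- strict equality respects the type split
```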

> Yes, in theory code today could be depending on typeof returning a
> closed set of values.  Is such an assumption warranted?  Actually
> that question may not be as important as whether anything of value
> depends on this assumption.

Agreed, this is the hard case that will lose in court.

> If we are timid (or responsible, either way, it works out to be the  
> same), then we should say no new values for typeof should ever be  
> minted, and that means that all new data types are of type object,  
> and none of them can ever have a value that is treated as false.

That's too timid, I agree.

> If we are bold (or foolhardy), then we should create new potential  
> results for the typeof operator early and often.

I think we have to.

If !0m is false then generic code that works over falsy values will
fail on decimal, or when number-typed values are promoted or migrated
to decimal. This seems worse to me than the
closed-set-of-typeof-codes violation. Not just based on real code
that does generic falsy testing via 'if (!x) ...', but because that
is the right minimal way to test falsiness. OK, that was an aesthetic
judgment in part, but it's also based on minimizing terms and
generalizing generic code fully.

Anyway, my logic exercise was not meant to be a proposition from  
Wittgenstein, but I found it persuasive. Probably Waldemar has a  
trump card to throw in favor of "decimal" as the new typeof code.


More information about the Es-discuss mailing list