Consistent decimal semantics

Sam Ruby rubys at intertwingly.net
Wed Sep 3 21:16:07 PDT 2008


Brendan Eich wrote:
> On Aug 25, 2008, at 5:25 PM, Waldemar Horwat wrote:
> 
>> Brendan Eich wrote:
>>>> - Should decimal values behave as objects (pure library
>>>> implementation) or as primitives?
>>>>
>>>> If they behave as objects, then we'd get into situations such as  
>>>> 3m !=
>>>> 3m in some cases and 3m == 3m in other cases.  Also, -0m != 0m would
>>>> be necessary.  This is clearly unworkable.
>>> What should be the result of (typeof 3m)?
>> "decimal".  It should not be "object" because it doesn't behave  
>> like other objects!
> 
> Clearly, you are correct (but you knew that ;-).
> 
> Specifically,
> 
> 1. typeof x == "object" && !x => x == null.
> 
> 2. typeof x == typeof y => x == y <=> x === y
> 
> 3. 1.1m != 1.1 && 1.1m !== 1.1 (IEEE P754 mandates that binary floats  
> be convertible to decimal floats and that the result of the  
> conversion of 1.1 to decimal be 1.100000000000000088817841970012523m)
> 
> Therefore typeof 1.1m != "object" by 1, or else 0m could be mistaken  
> for null by existing code.
> 
> And typeof 1.1m != "number" by 3, given 2.
> 
> This leaves no choice but to add "decimal".

:-)

You act as if logic applies here.  (And, yes, I'm being facetious.)  If 
that were true, then a === b => 1/a === 1/b.  But that's not the case 
in ES3 when a is 0 and b is -0: the two compare strictly equal, yet 1/a 
is Infinity while 1/b is -Infinity.
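That counterexample is checkable in any ES3 engine today; nothing here is hypothetical:

```javascript
// Strict equality cannot distinguish 0 from -0, but division can:
var a = 0;
var b = -0;

var strictlyEqual = (a === b);         // true
var recipA = 1 / a;                    // Infinity
var recipB = 1 / b;                    // -Infinity
var recipEqual = (recipA === recipB);  // false: a === b did not imply 1/a === 1/b
```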

1.1 and 1.1m, despite appearing quite similar, actually identify two 
(marginally) different points on the real number line.  This does not 
cause an issue with #2 above, any more than substituting the full 
expansions of these constants would cause an issue.
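That the binary literal 1.1 sits at a slightly different point can be seen from ES3 itself, by printing it to the maximum precision toPrecision guarantees (21 significant digits):

```javascript
// The literal 1.1 denotes the nearest binary64 double, not the decimal 1.1.
var printed = (1.1).toPrecision(21);
// printed is "1.10000000000000008882" -- the first 21 significant digits
// (rounded) of the exact double value 1.100000000000000088817841970012523...
```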

Similarly, the only apparent issue with #1 is the assumption that !(new 
Decimal(0)) is true.  But !(new Number(0)) is false, so there is no 
reason that the same logic couldn't apply to Decimal.
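The Number half of that claim is checkable today, and a minimal wrapper (a hypothetical stand-in for a library-only Decimal, not any proposed API) necessarily behaves the same way, since every object is truthy:

```javascript
// Object wrappers are truthy even when they wrap zero:
var numberWrapperTruthy = !!(new Number(0));   // true, i.e. !(new Number(0)) is false

// Hypothetical library-only Decimal; the constructor is illustrative only.
function Decimal(value) {
  this.value = value;
}
var decimalWrapperTruthy = !!(new Decimal(0)); // also true, for the same reason
```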

  - - -

From where I sit, there are at least three potentially logically 
consistent systems we can pick from.

We clearly want 0 == 0m to be true.  Let's get that out of the way.

Do we want 0 === 0m to be true?  If so, typeof(0m) should be "number". 
Otherwise, it should be something else, probably "object" or "decimal".

If typeof(0m) is "number", then !0m should be true.
If typeof(0m) is "object", then !0m should be false.
If typeof(0m) is "decimal", then we are free to decide what !0m should be.
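Today's primitives already illustrate the coupling assumed above: across different typeof results, === never holds even when == does, which is why making 0 === 0m true would force 0m to share typeof with 0:

```javascript
// == can succeed across typeof boundaries; === never does:
var looseCross  = ("0" == 0);    // true, via coercion
var strictCross = ("0" === 0);   // false: "string" vs "number"

// Within a single typeof, the two operators agree (NaN aside):
var withinType = (0 == 0) && (0 === 0);  // true
```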

My preference is for typeof(0m) to be "decimal" and for !0m to be true. 
But that is only a preference.  I could live with typeof(0m) being 
"object" and !0m being false.  I'm somewhat less comfortable with 
typeof(0m) being "number", only because it implies that methods made 
available to Decimal need to be made available to Number, and at some 
point some change to an existing class, no matter how apparently 
innocuous, will end up breaking something important.

Yes, in theory code today could be depending on typeof returning a 
closed set of values.  Is such an assumption warranted?  Actually, that 
question may not be as important as whether anything of value depends on 
this assumption.
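Code depending on a closed typeof set usually takes the shape of an exhaustive dispatch.  The function below is purely illustrative (not from any real library): it works for every value an ES3 engine can produce today, and a newly minted "decimal" result would fall through to the throw:

```javascript
// Hypothetical dispatcher that assumes ES3's typeof results are exhaustive.
function classify(x) {
  switch (typeof x) {
    case "number":    return "numeric";
    case "string":    return "text";
    case "boolean":   return "flag";
    case "undefined": return "missing";
    case "function":  return "callable";
    case "object":    return "reference";  // reached for null and all objects
    default:
      // A new typeof result such as "decimal" would land here.
      throw new Error("unexpected typeof: " + typeof x);
  }
}
```

With today's values classify never reaches the default branch; an engine that added "decimal" would break it.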

If we are timid (or responsible; either way, it works out to be the 
same), then we should say that no new values for typeof should ever be 
minted, which means that all new data types are of type "object", and 
none of them can ever have a value that is treated as false.

If we are bold (or foolhardy), then we should create new potential 
results for the typeof operator early and often.

- Sam Ruby


