Thoughts on IEEE P754
mjs at apple.com
Sat Aug 23 13:13:27 PDT 2008
On Aug 23, 2008, at 6:54 AM, Mike Cowlishaw wrote:
>> Maciej Stachowiak <mjs at apple.com> wrote:
>>> Finally, I'd like to take a poll: Other than people working on Decimal
>>> at IBM and people on the EcmaScript committee, is there anyone on this
>>> list who thinks that decimal adds significant value to EcmaScript? If
>>> so, please speak up. Thanks.
>> I'm on the EcmaScript committee, but I'll answer anyway. My view
>> is that decimal support is wildly out of line with the stated goals of
>> ES3.1 (cleanup and minimal critical enhancements), and was only
>> brought under consideration because of IBM's naked threat to vote
>> against any spec that did not include it. In the era of competing
>> specs, buying IBM's vote was worth a rush job.
> I am very sorry if you got that impression.
But I don't think you can say the impression is false, because IBM did
in fact make such a threat, which gives the appearance of tainting all
subsequent discussion of the feature.
> Our (IBM) customers tell us
> that this *is* a minimum critical enhancement. IBM hardware and
> products also didn't want to do this (there are always more exciting
> or 'higher buzz' things to do than fix the basic technology). It's
> something that needs to be done, and the less time one spends debating
> that, the less stressful and less expensive it will be. Decimal
> arithmetic is needed because almost all of humanity uses decimal
> arithmetic, and when the computing model matches the human model, no
> 'surprises' occur and the whole issue can then be ignored, because the
> devices 'just work right'. That's exactly in line with Apple's
> philosophy, I believe.
I do agree that binary floating point can be confusing to those who do
not really understand how it works (which is most programmers). But it
is not yet at all clear that having both binary and decimal floating
point will be less confusing.
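(A quick illustration of the confusion binary floating point causes, in plain EcmaScript as it stands today, with no decimal involved:)

```javascript
// Binary doubles cannot represent 0.1 or 0.2 exactly, so their sum
// misses 0.3 by one unit in the last place.
console.log(0.1 + 0.2);         // prints 0.30000000000000004
console.log(0.1 + 0.2 === 0.3); // prints false
```
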
> Incidentally, the original 754-1985 committee wanted the standard then
> to use decimal (hence IEEE 854). It's an accident of history, mostly, that
> one of the drivers of that standard was a chip that was too far along
> to change from binary to decimal. We're using binary floating-point
> largely because of the need to save a few transistors in 1981 -- that's
> no longer a good enough reason.
>> Furthermore, even though decimal doesn't seem like a completely
>> useless feature, I believe it has been given far disproportionate
>> weight relative to its actual value to the wide range of possible
>> Web applications, perhaps because it is of particular interest to Web
>> applications that directly handle money. However, that is either not
>> present or only a minor part of many of today's popular Web
>> applications on the public Web.
> I wish that were true. In practice, the applications that survive are
> those that ... make money.
Here are a few highly popular Web applications:
- Google search
- Yahoo search
- Google Maps
- Flickr
Of these, Flickr is the only one (to my knowledge) that even
accepts money directly from the user, and even in that case it only
accepts canned currency amounts. I agree there are some much more
commerce-oriented sites that could benefit, and which are very popular
in their own right, such as Amazon or eBay.
>> To my view, Decimal is much less important overall than, say,
>> ByteArray or some other effective way to represent binary data, which
>> is not in ES3.1 and was even rejected from ES4 (at least as then
>> spec'd). There are many more popular Web applications that would be
>> interested in manipulating binary data efficiently than care about
>> calculating prices.
> [No particular comment on this, except to note that scripting languages
> such as Rexx can manipulate binary data directly, and this turns out
> to be a minefield. (Big-endian vs. little-endian, etc.) It's much better
> to go up a level and handle things in network byte order, represented
> in hex (etc.). This can then be made more performant at a lower level
> if necessary.]
Manipulating binary data in hex is unlikely to be adequate for many of
the use cases.
>> Finally, while Decimal seems like it will be useful to knowledgeable
>> developers dealing with financial data, it seems likely to confuse the
>> average developer more than it will help. At least, a fair amount of
>> the discussion so far has made me feel confused, and I'd like to think
>> I have more than the average understanding of machine arithmetic for a
>> programmer.
> This isn't about machine arithmetic, that's exactly the point. It's about
> human arithmetic, and machines working in the same way. There are, of
> course, plenty of places (HPC for example) where performance is all that
> matters -- and for that, binary/machine arithmetic is fine. But it's
> alien to most of the inhabitants of this planet. :-)
There's nothing human about 10 printing as 1e1 or about the fine
distinction between 1.00m and 1.000000m, or about how .1 without the
magical m suffix might convert to decimal as a very surprising value,
or about decimal numbers possibly being unusable as array indices.
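(The surprising-conversion point doesn't even need a decimal type to demonstrate: asking today's binary doubles for more digits exposes the value that a naive binary-to-decimal conversion of the literal .1 would faithfully carry over:)

```javascript
// The literal .1 is stored as the nearest binary double, whose exact
// decimal expansion is not 0.1; a wide decimal type converting from it
// would preserve this surprise.
console.log((0.1).toFixed(20)); // prints 0.10000000000000000555
```
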
>> Now that we are beyond the era of competing specs and in the era of
>> Harmony, perhaps we could reconsider the value of buying IBM's vote,
>> and incorporate enhancements to the ECMAScript number model in a
>> thoughtful, unrushed way, after appropriate consideration of the
>> merits and use cases.
> Adding decimal to ECMAScript was proposed -- and agreed -- for future
> addition in 1998/1999 (see the TC39 minutes from then). We have now
> appropriately considered that for about ten years, and the IEEE 754
> committee (with over 150 contributors) has agreed, after 7+ years of
> discussion, a very stable decimal arithmetic model. This has been a
> thoughtful and unrushed process.
I don't think the rules for decimal arithmetic per se seem rushed, but
the integration of decimal into EcmaScript seems likely to be rushed
and full of strange pitfalls. Even JSON (a technically much simpler and
rather well-specified format and technology) took much longer to design
than the timeline expected for decimal.
More information about the es-discuss mailing list