Thoughts on IEEE P754

Mike Cowlishaw MFC at uk.ibm.com
Sat Aug 23 06:54:44 PDT 2008


> Maciej Stachowiak <mjs at apple.com> wrote:
>
> > Finally, I'd like to take a poll: Other than people working on decimal
> > at IBM and people on the EcmaScript committee, is there anyone on this
> > list who thinks that decimal adds significant value to EcmaScript? If
> > so, please speak up. Thanks.
> 
> I'm on the EcmaScript committee, but I'll answer anyway. My impression 
> is that decimal support is wildly out of line with the stated goals of 
> ES3.1 (cleanup and minimal critical enhancements), and was only 
> brought under consideration because of IBM's naked threat to vote 
> against any spec that did not include it. In the era of competing 
> specs, buying IBM's vote was worth a rush job.

I am very sorry if you got that impression.  Our (IBM) customers tell us 
that this *is* a minimal critical enhancement.  IBM's hardware and software 
product teams didn't want to do this either (there are always more 
'interesting' or 'higher buzz' things to do than fixing the basic 
technology).  It's just 
something that needs to be done, and the less time one spends debating 
that, the less stressful and less expensive it will be.  Decimal 
arithmetic is needed because almost all of humanity uses decimal 
arithmetic, and when the computing model matches the human model, fewer 
'surprises' occur and the whole issue can then be ignored, because the 
devices 'just work right'.  That's exactly in line with Apple's 
philosophy, I believe. 
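[The kind of 'surprise' in question is easy to show with ECMAScript's 
current binary floating-point numbers:]

```javascript
// Binary floating-point cannot represent 0.1 or 0.2 exactly, so a sum
// that any human would write as 0.3 comes out very slightly off.
var sum = 0.1 + 0.2;

console.log(sum);          // 0.30000000000000004
console.log(sum === 0.3);  // false

// Under a decimal arithmetic model, 0.1 + 0.2 would be exactly 0.3,
// matching the arithmetic people do on paper.
```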

Incidentally, the original 754-1985 committee wanted the standard then to 
use decimal (hence IEEE 854).  It's an accident of history, mostly, that 
one of the drivers of that standard was a chip that was too far along to 
change from binary to decimal.  We're using binary floating-point today 
largely because of the need to save a few transistors in 1981 -- that's no 
longer a good enough reason.

> Furthermore, even though decimal doesn't seem like a completely 
> useless feature, I believe it has been given far disproportionate 
> weight relative to its actual value to the wide range of possible 
> Web applications, perhaps because it is of particular interest to Web 
> applications that directly handle money. However, that is either not 
> present or only a minor part of many of today's popular Web 
> applications on the public Web.

I wish that were true.  In practice, the applications that survive are 
those that ... make money. 

> To my view, Decimal is much less important overall than, say, 
> ByteArray or some other effective way to represent binary data, which 
> is not in ES3.1 and was even rejected from ES4 (at least as then 
> spec'd). There's a lot more popular Web applications that would be 
> interested in manipulating binary data efficiently than care about 
> calculating prices.

[No particular comment on this, except to note that scripting languages 
such as Rexx can manipulate binary data directly, and this turns out to be 
a minefield.  (Big-endian vs. little-endian, etc.)  It's much better to go 
up a level and handle things in network byte order, represented in hex 
(etc.).  This can then be made more performant at a lower level if 
necessary.]
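[To illustrate the 'go up a level' approach -- the helper names here are 
hypothetical, not from any proposal -- a 32-bit value carried as 
big-endian (network-order) hex text is interpreted identically on any 
host, whatever its native byte order:]

```javascript
// Hypothetical helpers sketching the hex / network-byte-order idea:
// the textual big-endian form is unambiguous on every host.
function hexToUint32(hex) {
  return parseInt(hex, 16) >>> 0;   // "DEADBEEF" -> 3735928559
}

function uint32ToHex(n) {
  // Force to unsigned 32-bit, then format as 8 big-endian hex digits.
  var s = (n >>> 0).toString(16).toUpperCase();
  while (s.length < 8) s = "0" + s;
  return s;
}

console.log(hexToUint32("DEADBEEF")); // 3735928559
console.log(uint32ToHex(3735928559)); // "DEADBEEF"
```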
 
> Finally, while Decimal seems like it will be useful to knowledgeable 
> developers dealing with financial data, it seems likely to confuse the 
> average developer more than it will help. At least, a fair amount of 
> the discussion so far has made me feel confused, and I'd like to think 
> I have more than the average understanding of machine arithmetic for a 
> programmer.

This isn't about machine arithmetic -- that's exactly the point.  It's about 
human arithmetic, and machines working in the same way.  There are, of 
course, plenty of places (HPC for example) where performance is all that 
matters -- and for that, binary/machine arithmetic is fine.  But it's 
alien to most of the inhabitants of this planet.  :-)

> Now that we are beyond the era of competing specs and in the era of 
> Harmony, perhaps we could reconsider the value of buying IBM's vote, 
> and incorporate enhancements to the ECMAScript number model in a 
> thoughtful, unrushed way, after appropriate consideration of the 
> merits and use cases.

Adding decimal to ECMAScript was proposed -- and agreed -- for future 
addition in 1998/1999 (see the TC39 minutes from then).  We have now 
appropriately considered that for about ten years, and the IEEE 754 
committee (with over 150 contributors) has, after 7+ years of discussion, 
agreed on a very stable decimal arithmetic model.  This has been a 
thoughtful and unrushed process.

Mike

Unless stated otherwise above:
IBM United Kingdom Limited - Registered in England and Wales with number 
741598. 
Registered office: PO Box 41, North Harbour, Portsmouth, Hampshire PO6 3AU