# Why decimal?

Christian Plesner Hansen christian.plesner.hansen at gmail.com
Wed Jun 24 01:49:11 PDT 2009

```
>> I don't know, the user doesn't say why this inaccuracy is a problem.
>> It sounds like he's just generally unhappy that arithmetic is
>> approximate.  Decimal is approximate too.
>
> That's true at very extreme margins only! Decimal does not fail to round
> power-of-five products so badly, and I think you know this.

I know, and just after what you quoted I said "unless we know we'll
stay in base 10".  It's a fact that if you venture outside of base 10
you'll get more accurate results using k-bit binary than k-bit
decimal.
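
// A rough sketch of the accuracy-per-bit point above (illustrative
// only, assuming a simple 4-bits-per-digit decimal encoding): spend a
// 16-bit budget on 1/3 either way.  Four decimal digits buy the
// granularity 1/10^4; sixteen binary bits buy the finer 1/2^16.
var x = 1 / 3;
var dec = Math.round(x * 1e4) / 1e4;      // 4 decimal digits: 0.3333
var bin = Math.round(x * 65536) / 65536;  // 16 binary bits: 21845/65536
var decErr = Math.abs(x - dec);           // roughly 3.3e-5
var binErr = Math.abs(x - bin);           // roughly 5.1e-6, several times smaller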

>
> js> 79.46-39.96
> 39.49999999999999

A single approximate calculation doesn't say anything about how the
user would like arithmetic to behave generally.  Using any kind of
floating-point arithmetic properly is not easy, and the right solution
depends on your setting.  If these are dollar amounts, calculating
with integer cents may well be a better solution.  It's impossible to
tell from one isolated calculation what the right solution is, since
we don't know why the inaccuracy is a problem.  We can guess that
maybe decimal is the solution this user is looking for, but our
experience from asking around within Google, and from looking at the
one decimal library available, is that interest in decimal is very
limited in practice.
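
// A sketch of the integer-cents approach for the quoted example
// (illustrative only, not a prescription): keep money as whole cents,
// which binary doubles represent exactly for integers below 2^53.
var priceCents = 7946;      // $79.46
var discountCents = 3996;   // $39.96
var resultCents = priceCents - discountCents;  // exactly 3950
var display = (resultCents / 100).toFixed(2);  // "39.50", not 39.49999...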

> These dollars and cents calculations are quite common, and decimal handles
> them correctly with full precision and no rounding errors.

In which contexts do people perform these casual dollars-and-cents
calculations in JS?  I would expect the vast majority of
dollars-and-cents calculations to happen in the kind of setting that,
as I argued in my original mail, would most likely take place on the
server even if there were decimal support on the client.

>>  Unless the example is set
>> in a context where we know we'll stay in base 10, such as financial,
>> decimal arithmetic will only give you less accuracy per bit.
>
> Why are you talking about "bits" here?

To signify that I'm comparing 64-bit binary with 64-bit decimal, or
128-bit binary with 128-bit decimal, but not 64-bit binary with
128-bit decimal.

>> As far as I can see none of the reports collected under 5856 ask for
>> decimal, what they ask for is accurate arithmetic.  Decimal doesn't
>> provide that.  If people were asking for decimal arithmetic because
>> they needed it for some particular financial or scientific application
>> then that would be different.
>
> See above. I think you are missing something fundamental about the problem
> reported at that bug. Decimal does fix it for the use-cases reported there
> and in dups.

Which kind of arithmetic is appropriate depends on what your setting
is.  Hardly any of these bug reports give any context, so there is no
way to tell whether decimal will solve people's problems.  Here are a
few dups where it definitely won't:

https://bugzilla.mozilla.org/show_bug.cgi?id=356566
"The Javascript modulus arithmetic has gross errors when large numbers
are used "

https://bugzilla.mozilla.org/show_bug.cgi?id=369803
"When doing lots of small scale and translation equations I start
getting rounding errors."

-- Christian
```