[rust-dev] Appeal for CORRECT, capable, future-proof math, pre-1.0

Daniel Micay danielmicay at gmail.com
Sun Jan 12 13:08:22 PST 2014

On Sun, Jan 12, 2014 at 3:55 PM, Gábor Lehel <glaebhoerl at gmail.com> wrote:
> On Sat, Jan 11, 2014 at 11:18 AM, Marijn Haverbeke <marijnh at gmail.com>
> wrote:
>> I am not aware of an efficient way to provide
>> automatic-overflow-to-bignum semantics in a non-garbage-collected
>> language, without also imposing the burden of references/move
>> semantics/etc on users of small integers. I.e. integers, if they may
>> hold references to allocated memory can no longer sanely be considered
>> a simple value type, which doesn't seem like it'd be a good idea for
>> Rust.
>> If there is a good solution to this, I'd love to find out about it.
> This is a very good point. My thinking w.r.t. checking for overflow used to
> be that if you're taking the performance hit for it, you might as well
> expand to a big integer instead of failing, because if the use case is
> indexing into an array, then the bounds check will catch it, and in other
> use cases it's likely preferable.

An overflow check adds a branch and the pipeline serialization from
reading the carry/overflow flag. Expanding to a big integer requires
another branch on top of that, to dispatch on the representation.

Both can be compiled in such a way that they are always predicted
correctly, but they still add very significant overhead.
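A minimal sketch (mine, not from the post) of the cost being described: each checked addition compiles to the add itself plus a branch that inspects the CPU's overflow flag.

```rust
// Sketch of per-operation overflow checking via Rust's checked arithmetic.
fn checked_sum(a: i32, b: i32) -> i32 {
    // `checked_add` returns None on overflow; the `expect` is the branch.
    a.checked_add(b).expect("integer overflow")
}

fn main() {
    assert_eq!(checked_sum(2, 3), 5);
    // The overflow case takes the other side of the branch:
    assert!(i32::MAX.checked_add(1).is_none());
}
```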

> But checked integers are POD and big/expanding ones aren't, which is a big
> advantage for the former.

In almost every case, the overflow-checked integer hitting the bound
is still going to be a bug. An expanding type merely hides that bug by
silently growing instead of reporting it.
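A hypothetical sketch of the contrast being drawn here: a checked integer stays a plain Copy value (POD), while an auto-expanding integer must own heap storage once it grows, which forces move semantics and a destructor onto every small integer too. The bignum arithmetic itself is elided.

```rust
// Checked integer: still a bare machine word, freely copyable.
#[derive(Clone, Copy, Debug, PartialEq)]
struct Checked(i64);

// Expanding integer: once it grows it owns an allocation, so the
// type can no longer be Copy.
#[derive(Debug, PartialEq)]
enum Expanding {
    Small(i64),
    Big(Vec<u64>),
}

impl Checked {
    fn add(self, rhs: Checked) -> Checked {
        // Hitting the bound is treated as a bug and fails loudly.
        Checked(self.0.checked_add(rhs.0).expect("overflow"))
    }
}

impl Expanding {
    fn add(self, rhs: i64) -> Expanding {
        match self {
            Expanding::Small(n) => match n.checked_add(rhs) {
                Some(m) => Expanding::Small(m),
                // Overflow silently promotes; real bignum math elided.
                None => Expanding::Big(vec![n as u64]),
            },
            big => big, // bignum + small case elided in this sketch
        }
    }
}

fn main() {
    assert_eq!(Checked(2).add(Checked(3)), Checked(5));
    assert_eq!(Expanding::Small(1).add(2), Expanding::Small(3));
}
```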

> All of this really makes me wish hardware supported overflow checking in an
> efficient manner natively. If it can throw an exception for divide by zero,
> then why not overflow? (...I expect someone will actually have an answer for
> this.)

Rust can't make use of this kind of CPU support, since it wants to
report the failure via (Rust-specific) unwinding, and a hardware trap
doesn't hook into that.
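Without a hardware trap that feeds into language-level unwinding, the failure path has to be raised in software as an ordinary Rust panic. A sketch of what that looks like:

```rust
use std::panic;

// The check is done in software and the failure is reported through
// Rust's own (language-specific) unwinding, i.e. a panic.
fn add_or_unwind(a: i32, b: i32) -> i32 {
    a.checked_add(b)
        .unwrap_or_else(|| panic!("attempt to add with overflow"))
}

fn main() {
    assert_eq!(add_or_unwind(1, 2), 3);
    // The panic unwinds like any other Rust panic and can be caught:
    assert!(panic::catch_unwind(|| add_or_unwind(i32::MAX, 1)).is_err());
}
```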
