[rust-dev] Appeal for CORRECT, capable, future-proof math, pre-1.0

Palmer Cox palmercox at gmail.com
Mon Jan 13 17:50:03 PST 2014


On Mon, Jan 13, 2014 at 12:18 PM, Tobias Müller <troplin at bluewin.ch> wrote:

> Daniel Micay <danielmicay at gmail.com> wrote:
> > On Sun, Jan 12, 2014 at 1:23 PM, Tobias Müller <troplin at bluewin.ch> wrote:
> >> Isaac Dupree
> >> <ml at isaac.cedarswampstudios.org> wrote:
> >>> In general, Rust is a systems language, so fixed-size integral types are
> >>> important to have.  They are better-behaved than in C and C++ in that
> >>> signed types are modulo, not undefined behaviour, on overflow.  It could
> >>> be nice to have integral types that are task-failure on overflow as an
> >>> option too.  As you note, bignum integers are important too; it's good
> >>> they're available.  I think bignum rationals would be a fine additional
> >>> choice to have (Haskell and GMP offer them, for example).
> >>
> >> Wrapping overflow is just as bad as undefined behavior IMO.
> >
> > Do you know what undefined behavior is? It doesn't mean unspecified.
>
> True, but despite being so often cited, it won't format your hard disk
> (even in C).
> The result of an integer addition will always be an integer in every
> compiler I know, so in this specific case I don't fear the UB.


I don't think that's quite accurate. It's true that the result of undefined
behavior is unlikely to be that your hard drive gets formatted. However, you
aren't guaranteed to end up with an integer in any compiler that I'm aware
of either. It's subtle, but important to remember that the instruction that
actually overflows the integer register isn't really the undefined behavior.
It's not that the program is operating in a defined mode, then the integer
overflow occurs, and then the program transitions into an undefined mode.
Instead, a program that will execute a signed integer overflow at some point
in the future is at the mercy of undefined behavior from the very first
instruction.

The compiler, knowing that you aren't allowed to overflow a signed integer,
uses that information to do a variety of optimizations, including dead code
elimination. Since overflowing a signed integer is undefined, the compiler
assumes that you won't do it and uses that assumption to eliminate as much
code as possible. So the end result could be that an entire branch of the
program is eliminated. See http://blog.regehr.org/archives/213 (look for
"Type 3 Functions") for an example. The end result of signed-overflow
undefined behavior might not be an actual overflow at all, since the
compiler could remove the add instruction along with whatever else it thinks
it can. This can produce a program that works fine in debug builds but fails
once you crank up the optimization level, or a program that runs fine on a
particular compiler version but fails mysteriously after an update, or, as
in the example, a program that runs fine but silently skips an important
check.
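
To make that concrete, here's a small C sketch of my own (the function name
is made up; it isn't taken from Regehr's post) showing the kind of check an
optimizer is entitled to delete:

    #include <limits.h>
    #include <stdio.h>

    /* Intended as an overflow check. But in a program with no UB,
     * x + 1 > x is always true for a signed int, so an optimizer may
     * fold it to 1 and drop the caller's else branch entirely. */
    static int add_one_is_safe(int x)
    {
        return x + 1 > x;
    }

    int main(void)
    {
        int x = INT_MAX;
        if (add_one_is_safe(x))
            printf("%d + 1 does not overflow\n", x);
        else
            printf("overflow detected\n");
        return 0;
    }

Unoptimized builds will typically wrap the add and take the "overflow
detected" branch; with optimizations on, GCC and Clang are allowed to treat
the check as always true, so the protective branch silently disappears and
no overflowing add need remain in the generated code at all.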


> >> I cannot remember a single case of using signed integers where wrapping
> >> would make any sense.
> >
> > It often makes sense in codecs, hashing algorithms and cryptography.
>
> I'm sure there exist many cases where _unsigned_ int overflow makes sense.
> For _signed_ integers I'm a bit sceptical but I am no expert in that field.
>
> In any case it should not be the default but rather 'opt-in'.
>
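
For unsigned arithmetic, wraparound is already well defined in C, and that's
the kind of wrapping Daniel is describing in hashing code. A minimal, purely
illustrative sketch (the function name is mine): a djb2-style string hash
that wraps mod 2^32 by design:

    #include <stdint.h>
    #include <stdio.h>

    /* The repeated multiply-and-add is expected to overflow; because h is
     * unsigned, C defines the result to wrap mod 2^32, which is exactly
     * what the hash wants. */
    static uint32_t hash_djb2(const char *s)
    {
        uint32_t h = 5381;
        while (*s)
            h = h * 33u + (uint32_t)(unsigned char)*s++;
        return h;
    }

    int main(void)
    {
        printf("%u\n", (unsigned) hash_djb2("rust-dev"));
        return 0;
    }

Done with a plain signed int, the same loop would be UB for long enough
inputs; with a type that guarantees wrapping, it's the intended behavior.
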
> But this is not what I meant. Let me rephrase it differently:
> Assume that signed int overflow is UB (like in C). That means all actual
> overflows at runtime have to be considered bugs.
> Now I cannot imagine any such case (bug) where guaranteed wrapping would
> actually behave nicer.
>
> > If you don't have clear bounds and don't want modular arithmetic, you
> > need big integers.
>
> Or proper input validation. The type defines the bounds.
>
> >> And you lose some optimization opportunities.
> >
> > It's treated as undefined because there are more optimization
> > opportunities that way.
>
> That's what I wanted to say. If you guarantee wrapping overflow you lose
> those opportunities.
>
> >> So why not take the path of the Rust memory management and enforce bounds
> >> statically? It would need annotations on the types, like lifetimes, but it
> >> would be very rusty. Like C but safe.
> >
> > Rust isn't supposed to be really hard to write. Complex dependent typing would
>
> I'm not sure that this would be so complex. At least not more than the
> lifetime system. It's simple arithmetic on the bounds.
>
> In reality (Rust being a systems PL) fixed-width ints _will_ be used, and
> I am sure that overflow will often just be neglected. So why not enforce it
> statically?
>
> Tobi
>