value objects

David Herman dherman at mozilla.com
Wed Mar 21 10:04:22 PDT 2012


On Mar 20, 2012, at 3:55 PM, Allen Wirfs-Brock wrote:

> I don't actually see how this is substantially different, in concept, from Sam Ruby's decimal work.

I assume you mean this?

    https://mail.mozilla.org/pipermail/es-discuss/2008-September/007466.html

The important difference here is that I'm saying the typeof should be "object".
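
Concretely, a hypothetical sketch (the int64 constructor name and syntax here are mine, purely for illustration, not part of any proposal):

    // Regular expressions have rich, special behavior but still report "object":
    typeof /abc/;                        // "object"

    // Under this proposal, a 64-bit int value would do likewise, rather than
    // introducing a new typeof result such as "int64" or "value":
    var n = int64("9007199254740993");   // hypothetical constructor
    typeof n;                            // "object"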

> The main difference is that you have many more types, all of which have to be dealt with (including coercions/promotions, etc.).

No, that misses the point. The main difference is that we have a new class of object that can represent value types (rather similar to how regular expressions were introduced into JS), so that we can preserve compatibility with code that assumes there are exactly six typeof types, and so that we can then be free to introduce as many value types as we want.

We cannot add new typeof types willy-nilly. We could conceivably add just one more, say, "value", for value types. But even that would break many programs. I say, let's not add *any* new typeof types; let's just add a new [[Class]] of object for value types.
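
For example, plenty of deployed code dispatches exhaustively on today's typeof results and treats anything else as an error; a new typeof string would silently route new values into the failure path. Illustrative code only (serializeObject is an assumed helper):

    function serialize(v) {
      switch (typeof v) {
        case "undefined": return "null";
        case "boolean":
        case "number":    return String(v);
        case "string":    return JSON.stringify(v);
        case "function":  throw new TypeError("can't serialize a function");
        case "object":    return serializeObject(v);   // assumed helper
        default:          // a brand-new typeof result would land here
                          throw new TypeError("unknown type: " + typeof v);
      }
    }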

> I don't see why you aren't going to run into exactly the same issues (but more so, because of multiple types) that Sam did. ... Of course, I don't think Sam's issues were insurmountable; we just couldn't deal with them in a timely manner.

Sam was only working on decimal. Meanwhile, there's demand for 64-bit ints. I'm proposing an approach that can (a) be used to provide multiple new value types and (b) be grown in the future to provide more value types or even user-extensible ones. I'm going to work on it. If we can't get it done, we can't get it done.
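
To make the motivation concrete, here is the precision problem a 64-bit int type would address (int64 is again a hypothetical name, and the semantics shown are only what such a value type would be expected to provide):

    // Plain JS numbers are IEEE doubles and lose integer precision above 2^53:
    9007199254740993 === 9007199254740992;   // true -- the odd value can't be represented

    // A 64-bit int value type would keep the distinction:
    var a = int64("9007199254740993");       // hypothetical constructor
    var b = int64("9007199254740992");
    // a and b would be distinct, exact values under the proposed semantics.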

> When we abandoned the decimal work, we said we hoped we could find a generalization that would permit a more open-ended set of value types. I don't think we should give up on that before we have really tried.

As the strawman says, this approach doesn't give up on that. The proposal is to do the hard work of specifying the high-priority value types we actually need. From there we can consider generalizing to a user-extensible version. But note that generalized operator overloading will likely be controversial. Let's solve the important needs first and future-proof for generalization, but let's not stall everything to hold out for perfection.

> Conversion of operators into method calls and dynamic double dispatch for operand type promotion are well-understood techniques. I've noodled around with these concepts for ES, and I don't see why they wouldn't work just fine. When I get a chance I can walk you through how it could work.

Yeah, I know how these would work. My point is that we can get 64-bit ints and bignums into JS, in a backwards-compatible manner, without breaking the conceptual simplicity of only having one type of primitive number in the language, and without having to reach consensus on arbitrary user-extensible operator overloading. And we can do it without closing the door to the more general system later.
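
For the record, here is roughly how that double-dispatch technique looks when written as ordinary methods. The Complex type, the add/addComplex names, and the desugaring of a + b into a.add(b) are all invented for illustration, not proposed spec behavior:

    // Toy value-like type; immutable by convention.
    function Complex(re, im) { this.re = re; this.im = im; }

    Complex.prototype.add = function (other) {
      if (other instanceof Complex) {
        return new Complex(this.re + other.re, this.im + other.im);
      }
      // Unknown operand: dispatch a second time so the other side can promote.
      return other.addComplex(this);
    };

    // Teach plain numbers to answer the second dispatch by promoting themselves.
    // (Patching Number.prototype is just for the demo.)
    Number.prototype.addComplex = function (c) {
      return new Complex(c.re + this, c.im);
    };

    // "a + b" would conceptually desugar to a.add(b):
    new Complex(1, 2).add(3);   // a Complex with re 4, im 2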

> To me, your key insight seems to be that operations upon well-specified, immutable object types (including key methods) can be optimized in a fairly straightforward manner (I hate to use that term because it is so loaded). We probably should all be saying to ourselves, well, of course...

Um, OK... I'm not trying to get this published in a peer-reviewed journal. I'm just trying to solve problems.

> I agree that this is an area where we should be applying ongoing attention, but I don't think we need to close the door on an extensible set of numeric types yet.

As I say, not closing the door. Just solving problems incrementally. The perfect (and non-existent) is the enemy of the good.

Dave


