An array destructuring specification choice

Allen Wirfs-Brock allen at
Mon Nov 7 10:03:23 PST 2011

On Nov 7, 2011, at 9:21 AM, Andreas Rossberg wrote:

> On 7 November 2011 17:34, Allen Wirfs-Brock <allen at> wrote:
>>>  It is just another way to
>>> silently inject an `undefined' that is tedious to track down.  We
>>> already have too many of those...
>> It is how the language currently behaves in all situations where an object is needed but a primitive value is provided.
>>  We want consistency in language design, not a hodgepodge of special cases and different rules.
> Hm, I don't quite buy that. There are plenty of places in ES today
> where we don't convert but throw, e.g. "in", "instanceof", various
> methods of Object, etc.  Destructuring arguably is closely related to
> operators like "in".  Implicit conversion would violate the principle
> of least surprise for either, IMHO.

True, "in" and "instanceof" don't follow the rules.  I note that they were added in ES3, and I have to wonder whether they aren't another case of features being added without sufficient thought given to maintaining consistent behavior throughout the language. I don't know; I wasn't there.
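To make the deviation concrete, a quick sketch (an editor's illustration in plain JavaScript, not part of the original message) of where the language converts and where `in`/`instanceof` refuse to:

```javascript
// Ordinary property access applies ToObject: the string primitive is
// implicitly wrapped in a String object, and a result comes back
// instead of an error.
console.log("abc".length); // 3

// The relational operators added in ES3 instead throw a TypeError
// when the relevant operand is a primitive:
try {
  void ("length" in "abc");
} catch (e) {
  console.log(e instanceof TypeError); // true
}
try {
  void ({} instanceof 5);
} catch (e) {
  console.log(e instanceof TypeError); // true
}
```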

The same can be said for a few cases in the Object functions that were added for ES5.  If I had had the same depth of understanding of the language's internals then that I do now, I probably would have objected to those variances.
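For reference (an editor's note, not part of the original message): the ES5 Object functions in question, such as `Object.keys`, `Object.getOwnPropertyNames`, and `Object.freeze`, threw a TypeError for primitive arguments rather than applying ToObject. ES2015 later revised them to coerce (or pass primitives through), which is the behavior current engines exhibit:

```javascript
// Each of these threw a TypeError in ES5 when handed a primitive.
// ES2015 revised them to apply ToObject (or return the primitive),
// so modern engines print the coerced results:
console.log(Object.keys("ab"));             // ["0", "1"]
console.log(Object.getOwnPropertyNames(5)); // [] -- a Number wrapper has no own properties
console.log(Object.freeze(5) === 5);        // true (a TypeError in ES5)
```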

> I agree that consistency is a nice goal, but it seems like that train
> is long gone for ES. Also, if consistency implies proliferating an
> existing design mistake then I'm not sure it should have the highest
> priority.

Perhaps not the highest priority, but still a priority.

As the specification writer, I have in my head (yes, it would be good to write them down) a set of routine and consistent behaviors that I apply as I compose the specification algorithms.  I think this is similar to the conceptual understanding of the language that an expert JS programmer uses as they write code. Whenever something deviates from that norm, it has to be given special consideration.  For a new feature, my starting assumption is always that it will follow the norm.  Increasing the number of deviations from the norm doesn't necessarily make the language better, but it certainly makes it less internally consistent and harder to reason about.

Whether or not a particular consistent behavior was a "design mistake" is usually a subjective evaluation, and I'm not sure it is particularly relevant.  The core language is what it is, and that is what we have to work with.  Most such "mistakes" can't be pervasively fixed.  It isn't at all clear to me that spot fixing only new occurrences of such "mistakes" makes JS a better language.
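As a concrete footnote on the choice under discussion (an illustration added here, not part of the original message): under the "convert" rule, destructuring a number silently yields `undefined` for every element, exactly the hazard Rossberg describes; the array-destructuring semantics eventually standardized in ES2015 instead iterate the source, so a non-iterable primitive throws:

```javascript
// The "convert" rule: ToObject(5) produces a Number wrapper with no
// indexed properties, so every destructured element would be undefined.
var wrapped = Object(5);
console.log(wrapped[0]); // undefined -- the silently injected value

// The "throw" rule, which ES2015 ultimately adopted for array
// destructuring via the iteration protocol:
try {
  var [first] = 5;
} catch (e) {
  console.log(e instanceof TypeError); // true -- 5 is not iterable
}
```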


More information about the es-discuss mailing list