Wouldn't being bolder on new constructs save the "breaking the web" costs for the future?

Brendan Eich brendan at mozilla.com
Tue Jan 8 21:17:13 PST 2013


Herby Vojčík wrote:
> Brendan Eich wrote:
>> You are describing path dependency, which afflicts us all, in various
>> evolving systems including JS.
>>
>> We cannot keep resetting ES6 to include and think through and make
>> consistent things such as nil. Sorry, that trades off badly against
>> developers' time -- they need the good stuff coming already in top
>> engines, which Allen is drafting.
>
> If you would explain this better, I'd be glad - it is too dense for me 
> to understand (especially "include and think through" and "they need 
> the good stuff ..."). Thanks.

Adding some new feature to JS requires not just evaluating the design of 
the thing itself, but how it interacts with the rest of the language.

This is hard! It's not just a question of direct combinatorial 
complexity, but subtle indirect interactions, human factors pitfalls, 
things that often require implementation and user testing to find.
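
To make that concrete (a rough illustration, not a proposal): even the 
two "empty" values we already have interact with coercion, equality, and 
JSON in ways that any new value such as nil would have to re-answer 
consistently.

    // The existing empty values already carry a web of interactions;
    // a hypothetical nil would need a defensible answer for each.
    console.log(typeof null);         // "object" (a historical accident)
    console.log(typeof undefined);    // "undefined"
    console.log(null == undefined);   // true
    console.log(null === undefined);  // false
    console.log(null + 1);            // 1   (null coerces to 0)
    console.log(undefined + 1);       // NaN (undefined coerces to NaN)
    console.log(JSON.stringify({a: null, b: undefined})); // '{"a":null}'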

>> And since we're only human, along the dependent paths we walk, missteps
>> happen. If we can back up and take a better path in time for ES6, we 
>> will.
>
> I understand there is a legacy, but I am deeply convinced there is 
> hardly better path through which fix(es) can be made than through new 
> construct. Because old ones must work as expected.

Yes, can't break the web.

Making the new too different from the old, while keeping the old 
working, is certainly possible. But we do not want to make a big fork in 
the road. As pundit-of-malapropisms Yogi Berra said, "When you come to 
a fork in the road, take it".

If we fork JS too much with new runtime semantics, and developers "take 
the fork", then transpilers become full compilers and the cognitive load 
on all developers grows quite a bit.

JS does not need to fork in order to grow to what it should become. This 
is a vague statement, I grant you. But Harmony's means to its ends is

* Minimize the additional semantic state needed beyond ES5.

We agreed to this in July 2008 at Oslo, and I still think it holds. One 
reason for this agreement, above and beyond trying to avoid adding too 
much combinatorial and deep human-user-testing-required complexity: not 
trying to make JS into "a different language".
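
To illustrate roughly what "minimize additional semantic state" means in 
practice (details approximate, not normative): much of the new surface 
syntax can be explained as sugar over semantics ES5 already has, rather 
than as a new runtime model.

    // ES6-style surface syntax:
    function greet(name = "world", ...rest) {
      return "Hello, " + name + " (" + rest.length + " extra args)";
    }

    // Roughly the same behavior, using only ES5-level semantics:
    function greetES5(name) {
      if (name === undefined) name = "world";
      var rest = Array.prototype.slice.call(arguments, 1);
      return "Hello, " + name + " (" + rest.length + " extra args)";
    }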

Again, that's vague. Languages evolve, change is a constant over deep 
time. But consider ES4's attempts to add optional (gradual/hybrid) type 
annotations that could be statically checked. That was one of the things 
the Harmony ends-and-means agreements precluded, intentionally.

So we are intentionally limiting how much JS might grow in its kernel 
semantics. I think that's a good thing.

>> Another thing we try to do: make incremental progress edition by
>> edition, based on experience ("pave the cowpaths", lessons from nearby
>> languages, championed designs that are prototyped early, e.g. proxies),
>> but not close important doors. This is called future-proofing.
>
> What are those "champions" and "championed" you use in meeting notes 
> often? Thanks again. :-|

Proxies: tomvc and markm
Modules: samth and dherman
Max-Min Classes: awb, after dherman, after markm & others before

We avoid design-by-committee via the champions model, where one or two 
people design a given extension. TC39 curates championed extensions 
based on use-case-based demand, PLT theory and best practices, and the 
quality of the work itself.

> Would changing [[Construct]] semantics for `class` (at least half a 
> step) to clearly follow .prototype.constructor be "too far ahead 
> along one path"? It gives free new/super consistency and free default 
> constructors (and insight into the class/constructor-separated 
> worldview).

Allen seems to have addressed the present-tense use-cases here, without 
making class identity other than constructor identity.
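
As a rough sketch of that worldview (illustrative only; the draft 
semantics are Allen's to pin down): the class is its constructor, 
.prototype.constructor points back at it, and a subclass with no 
constructor body gets a default constructor that forwards to super.

    class Animal {
      constructor(name) { this.name = name; }
      speak() { return this.name + " makes a sound"; }
    }

    // Class identity is constructor identity -- no separate class object.
    console.log(typeof Animal);                           // "function"
    console.log(Animal.prototype.constructor === Animal); // true

    // A subclass without a constructor body gets a default constructor
    // that forwards its arguments to super.
    class Dog extends Animal {
      speak() { return this.name + " barks"; }
    }

    console.log(new Dog("Rex").speak()); // "Rex barks"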

/be

