On dropping @names

Claus Reinke claus.reinke at talk21.com
Mon Dec 10 12:59:27 PST 2012

>>>>    let lhs = rhs; statements
>>>>        // non-recursive, scope is statements
>>>>    let { declarations }; statements  // recursive, scope is
>>>>                                      // declarations and statements
>>    let { // group of mutually recursive bindings, *no statements*
>>        [x,y] = [42,Math.PI]; // initialization, not assignment
>>        even(n) { .. odd(n-1) .. } // using short method form
>>        odd(n) { .. even(n-1) .. } // for non-hoisting functions
>>        class X { .. }
>>        class C extends S { .. new X( odd(x) ) .. }
>>        class S { }
>>    };
>>    if (even(2)) console.log(  new C() );
> First of all, this requires whole new syntax for the let body. 

Yes and no - I'm borrowing definition syntax from other parts of
the language. Part of the appeal of having a declarations-only block
was to be able to use things like short method form there. The main 
appeal was to have no statements or hoisted constructs between 
declarations in a "letrec".
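To make the recursive group concrete: the mutually recursive part can
be approximated today with hoisted function declarations (a sketch
only - the proposed form would express the same bindings via short
method form, without relying on hoisting):

```javascript
// Sketch: the recursive part of the group, written with today's
// hoisted function declarations instead of the proposed short
// method form.
const [x, y] = [42, Math.PI];  // initialization, not assignment

function even(n) { return n === 0 ? true  : odd(n - 1); }
function odd(n)  { return n === 0 ? false : even(n - 1); }
```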

[by separating recursive and non-recursive forms, the non-recursive
 form would have no rhs-undefineds for the ids being defined, which
 would circumvent the separate, lexical form of dead zone]
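For contrast, here is what a self-referencing let does today (a
sketch, checked against a modern engine; the proposed non-recursive
form would instead resolve the rhs x in the outer scope):

```javascript
// Current behavior, sketched: the x on the rhs is the *new* x, which
// is in scope but uninitialized, so reading it throws.
function selfReferencingLet() {
  try {
    let x = x + 1;             // rhs x is the new x, not an outer one
    return x;
  } catch (e) {
    return e.constructor.name; // assumption: engine implements the tdz
  }
}
```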

> Second, it doesn't eliminate the need for temporal dead zones at all. 

You could well be right, and I might have been misinterpreting what
"temporal dead zone" (tdz) means. 

For a letrec, I expect stepwise-refinement-starting-from-undefined
semantics: I can use a binding anywhere in its scope, but may or
may not get a value for it yet. The tdz, by contrast, seems to
stipulate that a binding in scope doesn't really exist and may not
be accessed until its declaration statement (with an explicit
initializer, or implicitly undefined) has been evaluated.

> So what does it gain? The model we have now simply is that every 
> scope is a letrec (which is how JavaScript has always worked, albeit
> with a less felicitous notion of scope).

That is a good way of looking at it. So if there are any statements
mixed in between the definitions, we simply interpret them as
definitions (with side-effecting values) of unused bindings, and

{ let x = 0;
  let z = [x,y]; // (*)
  x++;
  let y = x;
  console.log(z);
}

is interpreted as

{ let x = 0;
  let z = [x,y]; // (*)
  let _ = x++;
  let y = x;
  let __ = console.log(z);
}

What does it mean here that y is *dead* at (*), *dynamically*?
Is it just that y at (*) is undefined, or does the whole construct 
throw a ReferenceError, or what? 

If tdz is just a form of saying that y is undefined at (*), then I can
read the whole block as a letrec construct. If y cannot be used 
until its binding initializer statement has been executed, then I 
seem to have a sequence of statements instead.
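Both readings can be sketched as ordinary functions (hypothetical
desugarings of the block above, checked against a modern engine):

```javascript
// Letrec reading: every binding exists from block entry, initialized
// to undefined, and is refined as the definitions are evaluated.
function letrecReading() {
  let x = 0, y = undefined;   // y pre-bound to undefined (sketch)
  let z = [x, y];             // (*) yields [0, undefined]
  x++;
  y = x;
  return z;                   // still [0, undefined]
}

// Dead-zone reading: y is in scope at (*) but may not be touched yet.
function tdzReading() {
  try {
    let x = 0;
    let z = [x, y];           // (*) reads y inside its dead zone
    x++;
    let y = x;
    return z;
  } catch (e) {
    return e.constructor.name; // "ReferenceError" if the tdz throws
  }
}
```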

Of course, letrec in a call-by-value language with side-effects is 
tricky. And I assume that tdz is an attempt to guard against 
unwanted surprises. But for me it is a surprise not only that
side-effects on the right-hand sides can modify bindings (x++),
but that binding initializations are interpreted as assignments
that bring variables back from the dead.

The discussion of dead zone varieties in


was driven by the interplay of old-style, hoisted definitions with
initialization desugaring to assignment. The former mimics a letrec,
with parallel definitions; the latter means a block of sequential
definitions.
So I was trying to get the old-style hoisting and initialization by
assignment out of the picture, leaving a block of recursive
definitions that has a chance of being a real letrec. Perhaps
nothing is gained wrt temporal dead zones. But perhaps this is a 
way to clean up the statement/definition mix, to profit from short
definition forms, and to provide a non-recursive let without a
lexical dead zone.


More information about the es-discuss mailing list