barrier dimension, default dimension

Brendan Eich brendan at
Fri Dec 28 12:13:32 PST 2012

Argh, why must mailman archive + copy/paste result in unreadably long 
lines. Here's the citation again (from

178 nor Normal All allen at CONF --- Must settle scoping 
details for block-scoped bindings

Much discussion here. The issue is whether let and const bindings hoist 
to block top, or start a new implicit scope (the let* or, let's call it, 
C++ rule). The prior work was nicely diagrammed by Waldemar in:

Quoting from Waldemar's message (note the future-proofing for guards):

--- begin quote ---

There are four ways to do this:
A1. Lexical dead zone.  References textually prior to a definition in 
the same block are an error.
A2. Lexical window.  References textually prior to a definition in the 
same block go to outer scope.
B1. Temporal dead zone.  References temporally prior to a definition in 
the same block are an error.
B2. Temporal window.  References temporally prior to a definition in the 
same block go to outer scope.

Let's take a look at an example:

let x = "outer";
function g() {return "outer"}

{
   function f() { ... x ... g ... g() ... }
   var t = some_runtime_type;
   const x:t = "inner";
   function g() { ... x ... }
}

B2 is bad because then the x inside g would sometimes refer to "outer" 
and sometimes to "inner".

A1 and A2 introduce extra complexity but don't solve the problem.  
You'd need to come up with a value for x to use in the very first call 
to g().  Furthermore, for A2 whether the window occurred or not would 
also depend on whether something was a function or not; users would be 
surprised that x shows through the window inside f but g doesn't.

That leaves B1, which matches the semantic model (we need to avoid 
referencing variables before we know their types and before we know the 
values of constants).

--- end quote ---
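For readers mapping these options onto what eventually shipped: B1 is the semantics ES6 adopted. Here's a minimal sketch (my example, not Waldemar's) of why "textually prior" and "temporally prior" differ -- a hoisted function lets a textually-prior reference execute temporally after initialization:

```javascript
// The read of x inside g is *textually* prior to the let declaration,
// but *temporally* posterior by the time g() is called. That makes it
// fine under the temporal rules (B1/B2), while the lexical rules
// (A1/A2) would flag the reference itself.
let result;
{
  function g() { return x; }  // textually-prior reference to x
  let x = "inner";
  result = g();               // runs after x is initialized
}
```

Calling g() before the `let x` line had run would instead throw under B1.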

In the September 2010 meeting, however, we took a wrong turn (my fault 
for suggesting it, but in my defense, just about everyone did prefer it 
-- we all dislike hoisting!) away from hoisted let and const bindings, 
seemingly achieving consensus for the C++ rule.

Allen, it turned out, did not agree, and he was right. Mixing 
non-hoisting (the C++ rule) with hoisting (function in block must hoist, 
for mutual recursion "letrec" use-cases and to match how function 
declarations at body/program level hoist) does not work. In the example 
above, g's use of x either refers to an outer x for the first call to 
g() in the block but not for the second (and varies for the indirect 
call via f()) -- dynamic scope! -- or else the uses before |const x|'s 
C++-style implicit scope has opened must be errors (early or not), which 
is indistinguishable from hoisting.

So at last week's meeting, we finally agreed to the earlier rules: all 
block-scoped bindings hoist to top of block, with a temporal dead zone 
for use of let and const before *initialization*.
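In the agreed semantics the binding exists from block top, but reads throw until the declaration is evaluated. A minimal sketch of that behavior (as it eventually shipped in ES2015):

```javascript
// Temporal dead zone: x is hoisted to the top of the block but left
// uninitialized, so reading it before the declaration has been
// evaluated throws a ReferenceError.
let sawTDZError = false;
{
  try {
    x;  // x is in scope (hoisted) but not yet initialized
  } catch (e) {
    sawTDZError = e instanceof ReferenceError;
  }
  let x = 42;
}
```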

The initialization point is also important. Some folks wondered if we 
could not preserve var's relative simplicity: var x = 42; is really var 
x; x = 42, and then the var hoists (this makes for insanity within 
'with', which recurs with 'let' in block vs. 'var' of same name in inner 
block -- IIRC we agreed to make such vars that hoist past same-named let 
bindings be early errors).
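That early-error rule did make it into ES6 as shipped: a var that would hoist past a same-named let is a SyntaxError. A sketch using eval to observe it (assuming a modern engine; the exact message varies):

```javascript
// A var that must hoist through a block containing a same-named let
// binding is an early SyntaxError: the code is rejected before any
// of it runs.
let earlyError = false;
try {
  eval('{ let x; { var x; } }');
} catch (e) {
  earlyError = e instanceof SyntaxError;
}
```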

With var, the initialization is just an assignment expression. A name 
use before that assignment expression has been evaluated results in the 
default undefined value of the var, assuming it was fresh. There is no 
read and write barrier requirement, as there is (in general, due to 
closures) for the temporal dead zone semantics.
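The var behavior described above, in code:

```javascript
// var: the declaration hoists and the binding defaults to undefined.
// A read before the assignment just sees undefined -- no read or
// write barrier is needed.
let before;
function varExample() {
  before = v;   // undefined: v is hoisted but not yet assigned
  var v = 42;
  return v;
}
const after = varExample();
```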

But if we try to treat let like var, then let and const diverge. We 
cannot treat const like var and allow any assignment as 
"initialization", and we must forbid assignments to const bindings -- 
only the mandatory initializer in the declaration can initialize. Trying 
to allow the "first assignment to a hoisted const" to win quickly leads 
to two or more values for a single const binding:

   x = 12;
   if (y) return x;
   const x = 3;
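With the agreed semantics, the stray assignment above cannot "initialize" the const: it hits the dead zone and throws. A sketch of the shipped behavior, wrapping the fragment in a function so the `return` is legal:

```javascript
// Assigning to a hoisted const before its declaration is evaluated
// throws a ReferenceError -- there is no "first assignment wins".
function trial(y) {
  x = 12;              // dead zone: throws, never initializes x
  if (y) return x;
  const x = 3;
  return x;
}
let threw = false;
try {
  trial(true);
} catch (e) {
  threw = e instanceof ReferenceError;
}
```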

The situation with let is constrained even ignoring const. Suppose we 
treat let like var, but hoisted to block top instead of body/program 
top, with use before set reading undefined, or in an alternative model 
that differs from var per temporal dead zone, throwing. So:

   print(x);
   x = 12;
   let x;

would result in either print being called with undefined or an error on 
the use of x before it was set by the assignment expression-statement -- 
those are the two choices given hoisting.

But then:

   x = 12;
   print(x);
   let x;

would result in either 12 being printed or an error being thrown 
assigning to x before its declaration was evaluated.

Any mixture of error with non-error (printing undefined or 12) is 
inconsistent. One could defend throwing in the use-before-assignment 
case, but it's odd. And throwing in both cases is the earlier consensus 
semantics of temporal dead zone with a distinct state for lack of 
initialization (even if the initialization is implicit, e.g., in a 
declaration such as let x; being evaluated). Here "initialization" is 
distinguished from assignment expressions targeting the binding.
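Under the consensus semantics, both orderings above fail the same way. A sketch (using console.log in place of print):

```javascript
// Temporal dead zone: the read and the write both throw before the
// declaration is evaluated -- no mixed outcomes.
function readFirst() {
  console.log(x);  // ReferenceError: read in the dead zone
  let x;
}
function writeFirst() {
  x = 12;          // ReferenceError: write in the dead zone
  let x;
}
const errors = [readFirst, writeFirst].map(f => {
  try { f(); return null; } catch (e) { return e.constructor.name; }
});
```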

Trying to be like var, printing undefined or 12, is possible but 
future-hostile to guards and gratuitously different from const:

   x = 12;
   const G = ...;
   let x ::G = "hi";

We want to be future-proof for guards, and even more important: we want 
to support *refactoring from let to const*. Ergo, only temporal dead 
zone with its barriers is tenable.

There remains an open issue: without closures obscuring analysis, it is 
easy to declare use before initialization within the direct 
expression-statement children of a given block to be early errors, 
rather than runtime errors:

   x = 12;          // can be early error
   print(x);        // can be early error
   function f() {
     return x;      // may or may not be error
   }
   escape(f);       // did this call f?
   let x = 42;
   escape2(f);      // did this call f?

Some on TC39 favor normative specification of early errors for the 
easily-decided cases. Others want runtime-only error checking all 
around, pointing out that even for the easy cases (straight-line code 
among the block's direct expression-statement children), any testing 
that reaches the block will fail fast. The question remains: what if the 
block is not covered by tests?
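The "may or may not be error" case above is decidable only at the call: the same closure throws before initialization and succeeds after. A sketch of the shipped behavior:

```javascript
// Whether f() errors depends on *when* it is called relative to the
// initialization of x -- exactly the case static analysis cannot
// settle once the function escapes.
let firstCall, secondCall;
{
  function f() { return x; }
  try { firstCall = f(); } catch (e) { firstCall = e.constructor.name; }
  let x = 42;
  secondCall = f();       // fine now: x is initialized
}
```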

Dave Herman brought up the let/var at top level equivalence implemented 
in SpiderMonkey, specifically in connection with <script> tags. 
Sketching in pseudo-HTML:

<script type=harmony>
   alert = 12;      // reassign built-in alert
</script>

<script type=harmony>
   let alert = 13;  // shadow built-in alert
   var quux = 14;   // this.quux = 14
   let quux = 15;   // alternative: in scope for later scripts?
</script>

Dave's point was not to commend the SpiderMonkey equating of let and var 
at top level, but to observe that if "let is the new var", then 
depending on how multiple successive script elements' contents are 
scoped, you may still need to use var in Harmony -- let won't be enough, 
if it binds only within the containing <script> element's scope.

Recall that Harmony removes the global (window in browsers) object from 
the scope chain, replacing it with a lexical environment with 
(generally) writable bindings. Each script starts with a fresh lexical 
environment, although it might be nested (see next paragraph).

For scripts that do not opt into Harmony, there's no issue. The global 
object is on the scope chain and it is used serially by successive 
script elements.

The question for Harmony scripts boils down to: should successive 
Harmony scripts nest lexical scopes in prior scripts' scopes, like 
matryoshka dolls? Or should each script opted into Harmony be its own 
module-like scope, in which case to propagate bindings to later scripts, 
one would have to

<script type=harmony>
   export let quux = 14; // available here and in later scripts
</script>

This remains an open question in TC39. Some liked the explicit 'export' 
requirement and the implicit module scope. Others objected that 
migrating code would expect the nested semantics, which was not 
inherently evil in their view.
--- end of block scope discussion ---


More information about the es-discuss mailing list