Generator issue: exceptions while initializing arguments

Brendan Eich brendan at
Mon Sep 10 14:33:43 PDT 2012

Tobie Langel wrote:
> On Sep 10, 2012, at 9:48 PM, Jason Orendorff<jason.orendorff at>  wrote: 
>> On Mon, Sep 10, 2012 at 12:50 PM, Kevin Smith<khs4473 at>  wrote:
>>>>>     function f(x=EXPR1, y=EXPR2) { BODY }
>>>>>     ===>
>>>>>     function f(x, y) {
>>>>>         if (x === void 0) x = EXPR1;
>>>>>         if (y === void 0) y = EXPR2;
>>>>>         BODY
>>>>>     }
>>> I'm not so sure - the desugaring above would mean that default expressions 
>>> would have visibility across curly-brace boundaries, which I find to be
>>> quite surprising.
>> It is surprising.  It could even bite unwary programmers.  But in what
>> scope do you propose to evaluate default-param expressions?
> In their lexical scope.

Which one? If you mean let*, so that

   function f(a = a, b = b*a) { return [a, b]; }
   var a = 42;
   console.log(f()) // [42, NaN]
   console.log(f(1)) // [1, NaN]
   console.log(f(1,2)) // [1, 2]

works, that's one "lexical scope" approach. But it is novel to JS and 
seen nowhere else in the language (in particular, there's no orthogonal 
analogue of let* in Scheme).
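
A rough desugaring can make that let* reading concrete. This is a sketch with made-up temporaries (`x_`, `x2`, etc.), not proposed spec text, and it uses a simpler second default than the example above so the fallback chain is easy to follow: each default sees the enclosing binding plus any parameters already initialized.

```javascript
var x = 42;

// Hypothetical let*-style desugaring of
//   function g(x = x, y = x + 1) { ... }
function g(x_, y_) {
  // Sequential initialization, as in Scheme's let*:
  var x2 = (x_ === void 0) ? x : x_;       // falls back to the outer x
  var y2 = (y_ === void 0) ? x2 + 1 : y_;  // sees the already-bound x2
  return [x2, y2];
}

console.log(g());      // [42, 43]
console.log(g(1));     // [1, 2]
console.log(g(1, 2));  // [1, 2]
```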

If you mean let with temporal dead zone (a kind of let rec, or let with 
hoisting and no read before initialization), then the reads of a and b 
above before they are initialized for f() and f(1) would throw.
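
Concretely, under the TDZ reading the original example behaves as follows (this is, as it happens, observable in any ES2015+ engine):

```javascript
// TDZ reading: each parameter binding exists throughout the
// parameter list but may not be read before it is initialized.
function f(a = a, b = b * a) { return [a, b]; }

try {
  f(); // the default `a = a` reads `a` before initialization
} catch (e) {
  console.log(e instanceof ReferenceError); // true
}

console.log(f(1, 2)); // [1, 2] -- no default needs evaluating
```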

What's more, as Allen reminded me tonight, "lexical scope" can be made 
to work with magic arguments objects, which alias the lexical bindings 
through spec and implementation magical back doors. But this violates 
the spirit of lexical scope. Do you really want to claim formal 
parameters have lexical scope given such junk as:

   function f(a, b = arguments[0]) { arguments[0] = 99; return [a, b]; }
   console.log(f(1)) // [99, 1]
   console.log(f(1, 2)) // [99, 2]


We could forbid arguments from being used to mean f's activation's 
arguments object in parameter default values. But that still leaves 
arguments[0] = 99 setting the "lexically scoped" a to 99. This is 
required by backward compatibility, and I argue it makes a var-like, 
not let-like or lexically scoped -- even though one could work magic 
via arguments object getters and setters, in the spec.

Implementations want let and anything let-like to be optimized assuming 
no such aliasing. I think users want that too, for bug reduction.


More information about the es-discuss mailing list