On dropping @names

Claus Reinke claus.reinke at talk21.com
Thu Dec 6 08:25:41 PST 2012


>> I would have preferred if let had not been modeled after var so much, but
>> that is another topic.
> 
> It is as clean as it can get given JS. 

I was hoping for something roughly like

    let lhs = rhs; statements
        // non-recursive, scope is statements

    let { declarations }; statements 
        // recursive, scope is declarations and statements

No hoisting needed to support recursion, no temporal dead zones,
no problem with referring to the old x when defining x non-recursively.
And less mixing of declarations and statements.
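For contrast, a small sketch (assuming the temporal-dead-zone
semantics under discussion for ES6 let):

    let x = 1;
    try {
        // the x on the right refers to the new, inner binding,
        // which is still uninitialised, so this throws
        { let x = x + 1; }
    } catch (e) {
        console.log(e instanceof ReferenceError); // true
    }

With a non-recursive let, the x on the right-hand side would refer
to the outer binding, and the inner x would simply be 2.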

> And you may be surprised to hear that there are some voices who 
> actually would have preferred a _more_ var-like behaviour.

Well, in the beginning let was meant to replace var, so it had to
be more or less like it for an easy transition. Later, even that
transition was considered too hard, so var and let coexist, giving
more freedom for the design of let. At least, that is my impression.
 
> The program equivalences are the same, up to annoying additional
> congruences you need to deal with for nu-binders, which complicate
> matters. Once you actually try to formalise semantic reasoning (think
> e.g. logical relations), it turns out that a representation with a
> separate store is significantly _easier_ to handle. Been there, done
> that.

Hmm, I used to find reasoning at term level quite useful (a very long
time ago, I was working on a functional logic language, which had 
something like nu-binders for logic variables). Perhaps it depends on
whether one reasons about concrete programs (program development)
or classes of programs (language-level proofs).

>> gensym is more "imperative" in terms of the simplest implementation:
>> create a globally unused symbol.
> 
> Which also happens to be the simplest way of implementing
> alpha-conversion. Seriously, the closer you look, the more it all
> boils down to the same thing.

Yep. Which is why I thought I'd speak up when I saw those concerns
in the meeting notes ;-)
 
>> Not under lambda-binders, but under nu-binders - they have to.
>>
>> I was explaining that the static/dynamic differences that seem to make
>> some meeting attendees uncomfortable are not specific to nu-scoped
>> variables, but to implementation strategies. For lambda-binders, one can get
>> far without reducing below them, but if one lifts that restriction,
>> lambda-bound variables appear as runtime constructs, too, just as for
>> nu-binders and nu-bound variables (gensym-ed names).
> 
> Not sure what you're getting at precisely, but I don't think anybody
> would seriously claim that nu-binders are useful as an actual
> implementation strategy.

More as a user-level representation of whatever implementation
strategy is used behind the scenes, just as lambda-binders are a
user-level representation of efficient implementations. 

But to clarify the point:

Consider something like: 

    (\x. (\y. [y, y]) x)

Most implementations won't reduce under the \x., nor will they
produce any result more detailed than 'function'. So those x and y
are purely static constructs.

However, an implementation that does reduce under the \x.
will need to deal with x as a dynamic construct, passing it to
\y. to deliver the result (\x. [x,x]).
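The same term in (proposed) arrow-function notation, with the
reduction under the binder as a hypothetical step - actual engines
would report no more than 'function':

    // most engines evaluate f no further than this:
    const f = x => (y => [y, y])(x);

    // an implementation that reduced under the x => binder could
    // simplify the body, producing the equivalent of:
    const g = x => [x, x];

    console.log(f(1), g(1)); // [ 1, 1 ] [ 1, 1 ]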

Now, the same happens with nu-binders, or private names:
after bringing them in scope, computation continues under
the nu-binder, so there is a dynamic representation (the
generated symbol) of the variable.
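In JS terms, a sketch of that dynamic representation (gensym here
is a hypothetical stand-in for however the engine creates a fresh,
globally unused name):

    // hypothetical: produce a fresh, globally unused key
    function gensym() {
        return "private$" + Math.random().toString(36).slice(2);
    }

    let name = gensym();    // the nu-bound variable, as a runtime value
    let obj = {};
    obj[name] = 42;         // computation continues under the binder,
    console.log(obj[name]); // 42 - using the generated symbol as key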

My point is that there isn't anything worrying about variables
appearing as dynamic constructs, nor is it specific to private
names - normal variables appearing to be static is just a
consequence of limited implementations. What is static is the
binding/scope structure, not the variables.

Since we mostly agree, I'll leave this here. Perhaps it helps
the meeting participants with their concerns.

Claus
 

