yield* desugaring

Andreas Rossberg rossberg at google.com
Mon May 13 02:07:05 PDT 2013


On 12 May 2013 21:29, Allen Wirfs-Brock <allen at wirfs-brock.com> wrote:
> First, as a general comment, I don't use direct desugaring within the spec but instead use the spec's pseudo-code formalisms.  This gives me direct access to mechanisms such as internal Completion values and allows me to express behaviors that are difficult or impossible to express via desugarings.

I could say a few things about why I think this is _not_ actually a
good approach in general, but that's another discussion...


> As now specified:
>
> 1) I specified yield* such that it will work with any iterator, not just generators. To me, this seems essential.  Otherwise client code is sensitive to other people's implementation decisions (which are subject to change) regarding whether to use a generator or an object-based iterator.  This was easy to accomplish and only requires a one-time behavioral check to determine whether "next" or "send" should be used to retrieve values from the delegated iterator, and a behavior guard on invoking "throw" on the delegated iterator.
>
> 2) yield* invokes the @@iterator method on its expression to obtain the iterator. This means you can say things like:
>     yield * [2,4,6,8,10];  // individually yield the even positive integers <= 10.
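[For concreteness, here is a sketch of the behavior Allen describes, written with the names that eventually shipped in ES2015 (where this works as written); at the time of this thread the draft protocol still used next/send:]

```javascript
// Sketch: yield* delegates to any iterable, not just another generator.
// The array's built-in iterator supplies the values one at a time.
function* evens() {
  yield* [2, 4, 6, 8, 10]; // individually yields each array element
}

const results = [];
for (const v of evens()) results.push(v);
// results is now [2, 4, 6, 8, 10]
```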

Is that a good idea? I'd be fine with it if you could transparently
generalise to iterators, but you can't. You need to special-case
iterators that are generators, and for them the semantics will be
quite different. For example, they will recursively handle sends and
throws to the outer generator, whereas for other iterators, where will
those even go? In a nutshell, what you are suggesting is to break the
iterator abstraction.
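[To illustrate the problem, a sketch of delegating to a hand-written plain iterator that has no throw method, again using the names that eventually shipped in ES2015; under the 2013 draft the details differ, but the question of where an outer throw goes is the same:]

```javascript
function* outer() {
  // Delegate to a plain iterator object: it has next() but no throw().
  yield* {
    [Symbol.iterator]() { return this; },
    next() { return { value: 1, done: false }; },
  };
}

const g = outer();
g.next(); // { value: 1, done: false } -- produced by the inner iterator
let caught;
try {
  g.throw(new Error("boom")); // the plain iterator cannot forward this
} catch (e) {
  caught = e;
}
// In the semantics that eventually shipped, the delegation has nowhere
// to send the injected exception, so a TypeError surfaces instead of
// the original Error.
```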

In any case, I think this is a substantial enough change from the
proposal that it needs consensus first.


> 3) yield* yields the nextResult object produced by the inner iterator. No unwrapping/rewrapping required.
>
> 4) I haven't (yet) provided a "close" method for generators.  I still think we should.  Unwinding via return is clearly the appropriate semantics for "close". Contrary to some concerns expressed on this thread, I don't think there are any issues with distinguishing "close"-triggered returns from actual user-level returns. Such returns (all returns) have to pass as Completion values through the outer body level of the generator, and it is easy enough (at the spec level) for me to tag such returns (using a unique, unobservable internal return value) so that there is no confusion with an actual user-level return.
>
> I think "close" is an useful and important operation for maintaining finally semantics in two use cases:
>       a)       for (v of iterable) {...break;...} /* or return or throw or outer continue*/
> The for-of should automatically invoke "close" if the loop is terminated before exhausting the iterator.  The "close" needs to be guarded with a behavioral check, just as I did for yield*.  I think this is a common case, and we really should be making our best effort to maintain the reliability of finally blocks for it.
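[A sketch of the guarantee being discussed, using the names that eventually shipped in ES2015, where "close" became the iterator's return() method and for-of invokes it automatically on early exit:]

```javascript
let closed = false;

function* counter() {
  try {
    let i = 0;
    while (true) yield i++;
  } finally {
    closed = true; // runs even though the loop below exits early
  }
}

const seen = [];
for (const v of counter()) {
  seen.push(v);
  if (v >= 2) break; // triggers iterator.return() under the hood
}
// seen is [0, 1, 2] and closed is true: the finally block was honored.
```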

A couple of observations:

1. This again seems to require breaking the iterator abstraction,
since you have to special-case the described close behaviour for
iterators that are generators (you cannot close other iterators).

2. It amounts to requiring every for-of loop over a generator to be
wrapped into an implicit try-statement. That is doable, but imposes a
substantial cost, which we had just removed by retiring StopIteration.

3. Whatever way you turn it, close is not a well-behaved API, because
(a) we cannot guarantee it being called, and (b) a generator could
actually intercept it in a try-finally and re-yield another result
(same problem as with the StopIteration exception before). What would
that mean?
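[Point (b) can be demonstrated with the semantics that eventually shipped in ES2015, where "close" became return(): a generator can intercept the close in a finally block and yield again, leaving the caller with an iterator that refuses to finish.]

```javascript
function* stubborn() {
  try {
    yield 1;
  } finally {
    yield "not done yet"; // re-yield while being closed
  }
}

const g = stubborn();
g.next();              // { value: 1, done: false }
const r = g.return(0); // the close is intercepted by the finally block:
                       // r is { value: "not done yet", done: false }
```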

/Andreas


More information about the es-discuss mailing list