yield* desugaring

Allen Wirfs-Brock allen at wirfs-brock.com
Mon May 13 10:24:12 PDT 2013

On May 13, 2013, at 2:07 AM, Andreas Rossberg wrote:

> On 12 May 2013 21:29, Allen Wirfs-Brock <allen at wirfs-brock.com> wrote:
>> First, as a general comment, I don't use direct desugaring within the spec. but instead use the spec. pseudo code formalisms.  This gives me direct access to mechanisms such as internal Completion values and allows me to express behaviors that are difficult or impossible to express via desugarings.
> I could say a few things about why I think this is _not_ actually a
> good approach in general, but that's another discussion...
>> As now specified:
>> 1) I specified yield* such that it will work with any iterator, not just generators. To me, this seems essential.  Otherwise client code is sensitive to other peoples implementation decision (that are subject to change) regarding whether to use a  generator or an object based iterator.  This was easy to accomplish and only requires a one-time behavioral check to determine whether "next" or "send" should be used to retrieve values from the delegated iterator and an behavior guard on invoking "throw" on the delegated iterator.
>> 2) yield* invokes the @@iterator method on its expression to obtain the iterator. This means you can say things like:
>>    yield * [2,4,6,8,10];  //individually yield the even integers <= 10.
> Is that a good idea? I'd be fine with it if you could transparently
> generalise to iterators, but you can't. You need to special-case
> iterators that are generators, and for them the semantics will be
> quite different. For example, they will recursively handle sends and
> throws to the outer generator, whereas for other iterators, where will
> those even go? In a nutshell, what you are suggesting is to break the
> iterator abstraction.

Yes, it's a very good idea.  The easy way for an imperative programmer (there are a few of us in the world) to understand yield* is as a yielding loop over an iterator.  That is very straightforward to understand, rather than describing it as a "mechanism for composing generators" (from the wiki), which meant nothing to me until I carefully studied the desugaring.  At that point it became clear that it was just a yielding loop over an iterator that, for some reason, was arbitrarily restricted to generators.

That restriction is what is breaking the iterator abstraction.  The most common use case of generators is implementing iterators. If I'm implementing an iterator via a generator and I have to perform an inner iteration over a contained iterable (for example, some sort of flattening operation), the way I code that inner iteration shouldn't depend upon whether the implementor of the inner iterable chose to use a generator rather than a stateful object as the iterator. If it does, then I am essentially precluded from using yield* in this most common situation.  Instead I could only use yield* in situations where I know that the target object is implemented via a generator.  To me, that is too gross a violation of the iteration abstraction, and it would call into question why we even have yield*.
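To make the flattening use case concrete, here is a sketch (hypothetical code, not from the proposal) of a generator that delegates to inner iterables without knowing how each is implemented; it assumes yield* generalized to arbitrary iterables, as argued for above:

```javascript
// Sketch: a generator that flattens one level of nesting. The inner
// values may be arrays, Sets, or generator-backed iterables -- with
// yield* generalized to any iterable, the author of flatten() need
// not know how each inner iterable is implemented.
function* flatten(iterableOfIterables) {
  for (const inner of iterableOfIterables) {
    yield* inner; // works whether `inner` is generator-backed or not
  }
}

function* evens() { // a generator-backed inner iterable
  yield 2;
  yield 4;
}

const flat = [...flatten([[1], evens(), new Set([5, 6])])];
// flat is [1, 2, 4, 5, 6]
```

If yield* instead required its operand to be a generator, flatten() would have to branch on the implementation of each inner iterable, which is exactly the abstraction violation described above.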

Could you clarify the special-case handling you have in mind? There is nothing in the wiki proposal's desugaring of yield* that guarantees that the delegated object is an actual generator.  All it requires is a "send" method (and, in that desugaring, "close" plus "throw" if "throw" is actually invoked). Once "send" is invoked (whatever it does) you're operating within the delegated object, and it doesn't (and shouldn't) know whether it was invoked as part of a yield*, a for-of, or an explicit method call.

Regarding recursive "sends" to an outer generator: these shouldn't work, according to the wiki proposal.  When executing a yield*, the outer generator must be in the "executing" state. Invoking an inner generator from a yield* via a "send" invocation still leaves the outer generator in the "executing" state.  If the inner generator invokes "send" on the outer generator, the "send" will throw because the outer is already in the "executing" state.
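This re-entrancy rejection can be demonstrated with a small sketch (using the semantics ECMAScript ultimately shipped, where resuming an already-running generator throws a TypeError; the wiki proposal's "send" behaves analogously for a generator in the "executing" state):

```javascript
// Sketch: re-entering a generator that is already running is rejected.
let outer; // self-reference used to attempt the recursive resume
function* gen() {
  yield (() => {
    try {
      outer.next(); // recursive "send" while `outer` is executing
      return "re-entered";
    } catch (e) {
      return e instanceof TypeError ? "rejected" : "other error";
    }
  })();
}
outer = gen();
const result = outer.next().value;
// result is "rejected": the outer generator cannot be re-entered
```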

Our real problem here seems to be that generators specialize the iterator abstraction in ways that make it hard to use them interchangeably. 

First, why do we need "send" at all?  Why not simply allow an argument to be passed to "next" (of course, it is already allowed) and leave it up to the generator implementation whether to pay any attention to it.   Clearly a client needs to be aware when it is using a generator that expects to receive a value back from yield, so that fact must be documented in the public contract of that generator.  Once that is done, the client can use "next" as easily as it could use "send".  Of course, if people really like the name "send", we could also provide that method for generators with the meaning:
    send(value) { return this.next(value); }
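A sketch of the two-way communication this argues for, using only next(value) (the accumulator generator is a hypothetical example, not from the proposal):

```javascript
// Sketch: two-way communication with plain next(value); no separate
// "send" method is needed. The generator's public contract documents
// that it expects a value back from each yield.
function* accumulator() {
  let total = 0;
  while (true) {
    total += yield total; // the value passed to next() resumes the yield
  }
}

const acc = accumulator();
acc.next();                      // prime: run to the first yield
acc.next(5);                     // total becomes 5
const total = acc.next(3).value; // total becomes 8
```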

That leaves only "throw" as an issue.  Personally, I'd just make it part of the Iterator interface and provide an Iterator abstract class that provides
   throw(exception) { throw exception; }
as the default "throw" implementation, so most iterator authors don't even have to think about it. Short of that, I think having an explicit behavioral check for "throw" in the yield* algorithm is a very small cost (one that only arises if someone actually invokes the "throw" method on the outer generator) and would take care of the most common situations where "throw" is likely to be invoked on an iterator.
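The guarded delegation loop described here can be sketched as follows (a hypothetical helper written in user-level JavaScript, not the spec pseudo-code): "throw" is forwarded to the inner iterator only if the inner iterator actually defines a throw method; otherwise the exception surfaces at the delegation site, which matches the proposed default of `throw(exception) { throw exception; }`.

```javascript
// Sketch of a yield*-like delegation loop with a behavioral guard
// on "throw".
function* delegate(iterable) {
  const inner = iterable[Symbol.iterator]();
  let result = inner.next();
  while (!result.done) {
    let sent;
    try {
      sent = yield result.value; // value sent back in via next(v)
    } catch (e) {
      if (typeof inner.throw === "function") {
        result = inner.throw(e); // inner iterator opted in to "throw"
        continue;
      }
      throw e; // behavioral guard: no throw method, so rethrow here
    }
    result = inner.next(sent);
  }
  return result.value;
}

// Array iterators define no throw method, so the exception surfaces
// at the delegation site instead of requiring the inner iterator to
// be a generator.
const d = delegate([1, 2, 3]);
d.next();
let caught;
try {
  d.throw(new Error("boom"));
} catch (e) {
  caught = e.message; // "boom"
}
```

Note the cost of the guard is a single `typeof` check, and it is only paid when "throw" is actually invoked on the delegating generator.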

> In any case, I think this is a substantial enough change from the
> proposal that it needs consensus first.

That's why I brought it to everyone's attention here...

>> 3) yield* yields the nextResult object produced by the inner iterator. No unwrapping/rewrapping required.
>> 4) I haven't (yet) provided a "close" method for generators.  I still think we should.  Unwinding via return is clearly the appropriate semantics for "close". Contrary to some concerns expressed on this thread, I don't think there are any issues with distinguishing "close" triggered returns from actual user level returns. Such returns (all returns) have to pass as Completion values through the outer body level of the generator, and it is easy enough (at the spec level) for me to tag such returns (using a unique, unobservable internal return value) such that there is no confusion with an actual user level return.
>> I think "close" is a useful and important operation for maintaining finally semantics in two use cases:
>>      a)       for (v of iterable) {...break;...} /* or return or throw or outer continue*/
>> The for-of loop should automatically invoke "close" if the loop is terminated before exhausting the iterator.  The "close" needs to be guarded with a behavioral check, just as I did for yield*.  I think this is a common case, and we really should be making a best effort to maintain the reliability of finally blocks for it.
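The finally-preservation behavior argued for in (a) can be sketched with the semantics ECMAScript eventually shipped, where the hook is named return() rather than "close" (the resource() generator is a hypothetical example):

```javascript
// Sketch: early exit from for-of runs the generator's finally block.
// The proposal's "close" plays the role the shipped spec gives to the
// iterator's return() method.
let cleanedUp = false;
function* resource() {
  try {
    yield 1;
    yield 2;
  } finally {
    cleanedUp = true; // finally semantics preserved on early exit
  }
}

for (const v of resource()) {
  break; // exiting the loop triggers the close/return hook
}
// cleanedUp is true
```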
> A couple of observations:

In my response to Andy I concluded that syntactically restricting yield to not be finally protected is the better solution.


> 1. This again seems to require breaking the iterator abstraction,
> since you have to special-case the described close behaviour for
> iterators that are generators (you cannot close other iterators).
> 2. It amounts to requiring every for-of loop over a generator to be
> wrapped into an implicit try-statement. That is doable, but imposes a
> substantial cost, which we had just removed by retiring StopIteration.
> 3. Whatever way you turn it, close is not a well-behaved API, because
> (a) we cannot guarantee it being called, and (b) a generator could
> actually intercept it in a try-finally and re-yield another result
> (same problem as with the StopIteration exception before). What would
> that mean?
> /Andreas
