[Python-ideas] Possible PEP 380 tweak
Jacob Holm jh at improva.dk
Tue Oct 26 03:35:33 CEST 2010
On 2010-10-25 17:13, Guido van Rossum wrote:
> On Mon, Oct 25, 2010 at 3:19 AM, Jacob Holm <jh at improva.dk> wrote:
>> More interesting (to me at least) is that this is an excellent example
>> of why I would like to see a version of PEP 380 where "close" on a
>> generator can return a value (AFAICT the version of PEP 380 at
>> http://www.python.org/dev/peps/pep-0380 is not up-to-date and does not
>> mention this possibility, or even link to the heated discussion we had
>> on python-ideas around March/April 2009).
>
> Can you dig up the link here? I recall that discussion but I don't
> recall a clear conclusion coming from it -- just heated debate.
Well here is a recap of the end of the discussion about how to handle generator return values and g.close().
Greg's conclusion that g.close() should not return a value: http://mail.python.org/pipermail/python-ideas/2009-April/003959.html
My reply (ordered list of ways to handle return values in generators): http://mail.python.org/pipermail/python-ideas/2009-April/003984.html
Some arguments for storing the return value on the generator: http://mail.python.org/pipermail/python-ideas/2009-April/004008.html
Some support for that idea from Nick: http://mail.python.org/pipermail/python-ideas/2009-April/004012.html
You're not convinced by Greg's argument: http://mail.python.org/pipermail/python-ideas/2009-April/003985.html
Greg arguing that using GeneratorExit this way is bad: http://mail.python.org/pipermail/python-ideas/2009-April/004001.html
You add a new complete proposal including g.close() returning a value: http://mail.python.org/pipermail/python-ideas/2009-April/003944.html
I point out some problems e.g. with the handling of return values: http://mail.python.org/pipermail/python-ideas/2009-April/003981.html
Then the discussion goes on at length about the problems of using a coroutine decorator with yield-from. At one point I am arguing for generators to keep a reference to the last value yielded: http://mail.python.org/pipermail/python-ideas/2009-April/004032.html
And you reply that storing "unnatural" state on the generator or frame object is a bad idea: http://mail.python.org/pipermail/python-ideas/2009-April/004034.html
From which I concluded that having g.close() return a value (the same on each successive call) would be a no-go: http://mail.python.org/pipermail/python-ideas/2009-April/004040.html
Which you confirmed: http://mail.python.org/pipermail/python-ideas/2009-April/004041.html
The latest draft (#13) I have been able to find was announced in http://mail.python.org/pipermail/python-ideas/2009-April/004189.html
And can be found at http://mail.python.org/pipermail/python-ideas/attachments/20090419/c7d72ba8/attachment-0001.txt
I had some later suggestions for how to change the expansion, see e.g. http://mail.python.org/pipermail/python-ideas/2009-April/004195.html (I find that version easier to reason about even now 1½ years later)
> Based on my example I have to agree that returning a value from
> close() would be nice. There is a little detail of how multiple
> arguments to StopIteration should be interpreted, but that's not so
> important if it's being raised by a return statement.
Right. I would assume that the return value of g.close(), if we ever got one, would be taken from the first argument to the StopIteration.
What killed the proposal last time was the question of what should happen when you call g.close() on an exhausted generator. My preferred solution was (and is) that the generator should save the value from the terminating StopIteration (or None if it ended by some other means), that g.close() should return that value each time, and that g.next(), g.send() and g.throw() should raise a StopIteration carrying the value. Unless you have changed your position on storing the return value, that solution is dead in the water.
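To make those semantics concrete, here is a rough emulation as a wrapper class (Python 3 spelling; the class and attribute names are made up for illustration, throw() would behave like send() and is omitted for brevity, and under the actual proposal this behaviour would of course live on the generator type itself):

    class StoreReturnValue:
        # Emulates the proposal: the terminating value is saved, close()
        # returns it on every call, and further next()/send() raise a
        # StopIteration carrying it.
        _UNSET = object()

        def __init__(self, gen):
            self._gen = gen
            self._value = self._UNSET

        def __iter__(self):
            return self

        def _capture(self, e):
            # Save the terminating value (None for a bare return).
            self._value = e.args[0] if e.args else None
            raise StopIteration(self._value)

        def __next__(self):
            if self._value is not self._UNSET:
                raise StopIteration(self._value)
            try:
                return next(self._gen)
            except StopIteration as e:
                self._capture(e)

        def send(self, value):
            if self._value is not self._UNSET:
                raise StopIteration(self._value)
            try:
                return self._gen.send(value)
            except StopIteration as e:
                self._capture(e)

        def close(self):
            if self._value is self._UNSET:
                try:
                    self._gen.throw(GeneratorExit)
                except StopIteration as e:
                    self._value = e.args[0] if e.args else None
                except GeneratorExit:
                    self._value = None
                else:
                    raise RuntimeError('generator ignored GeneratorExit')
            return self._value

    def adder():
        total = 0
        try:
            while True:
                total += yield
        except GeneratorExit:
            return total

    g = StoreReturnValue(adder())
    next(g); g.send(1); g.send(2)
    print(g.close(), g.close())   # -> 3 3, the same value each time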
For this use case we don't actually need to call close() on an exhausted generator, so perhaps there is some use in only returning a value when the generator is actually running.
Here's a stupid idea... let g.close() take an optional argument that it can return if the generator is already exhausted, and let it return the value from the StopIteration otherwise:

    def close(self, default=None):
        if self.gi_frame is None:
            return default
        try:
            self.throw(GeneratorExit)
        except StopIteration as e:
            return e.args[0] if e.args else None   # bare return -> None
        except GeneratorExit:
            return None
        else:
            raise RuntimeError('generator ignored GeneratorExit')
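For what it's worth, the idea can be tried out today as a free function, since the built-in generator type can't be patched. The function name below is mine, and the example generator returns a value when closed, as PEP 380 would allow:

    def close_with_default(gen, default=None):
        # Same logic as the method above, with the generator passed in
        # explicitly; guards e.args for the bare-return case.
        if gen.gi_frame is None:
            return default
        try:
            gen.throw(GeneratorExit)
        except StopIteration as e:
            return e.args[0] if e.args else None
        except GeneratorExit:
            return None
        else:
            raise RuntimeError('generator ignored GeneratorExit')

    def counter():
        count = 0
        try:
            while True:
                yield
                count += 1
        except GeneratorExit:
            return count          # becomes StopIteration(count)

    g = counter()
    next(g)                       # prime
    next(g); next(g)
    print(close_with_default(g))                   # -> 2
    print(close_with_default(g, default='gone'))   # -> 'gone' (exhausted)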
> I totally agree that not having to call throw() and catch whatever it
> bounces back is much nicer. (Now I wish there was a way to avoid the
> "try..except GeneratorExit" construct in the generator, but I think I
> should stop while I'm ahead. :-)
>
> The interesting thing is that I've been dealing with generators used as
> coroutines or tasks intensely on and off since July, and I haven't had
> a single need for any of the three patterns that this example happened
> to demonstrate:
>
> - the need to "prime" the generator in a separate step
> - throwing and catching GeneratorExit
> - getting a value from close()
>
> (I did have a lot of use for send(), throw(), and extracting a value
> from StopIteration.)
I think these things (at least priming and close()) are mostly an issue when using coroutines from non-coroutines. That means they are likely to be common in small examples where you write the whole program, but less common when you are writing small(ish) parts of a larger framework.
Throwing and catching GeneratorExit is not common, and according to some shouldn't be used for this purpose at all.
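For concreteness, here is a toy (mine, not from the thread) that happens to use all three patterns at once. Since close() as specified by PEP 342 returns None, the value has to be dug out of the StopIteration by calling throw() by hand:

    def averager():
        total, count = 0.0, 0
        try:
            while True:
                total += yield            # woken up by send()
                count += 1
        except GeneratorExit:              # catching GeneratorExit
            return total / count if count else 0.0

    g = averager()
    next(g)                                # "prime" in a separate step
    g.send(10.0)
    g.send(20.0)
    try:
        g.throw(GeneratorExit)             # what close() does internally
    except StopIteration as e:
        print(e.args[0])                   # -> 15.0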
> In my context, generators are used to emulate concurrently running
> tasks, and "yield" is always used to mean "block until this piece of
> async I/O is complete, and wake me up with the result". This is
> similar to the "classic" trampoline code found in PEP 342.
>
> In fact, when I wrote the example for this thread, I fumbled a bit
> because the use of generators there is different from how I had been
> using them (though it was no doubt thanks to having worked with them
> intensely that I came up with the example quickly).
This sounds a lot like working in a "larger framework" to me. :)
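(For readers without PEP 342 at hand, the "classic" trampoline pattern is roughly the following. This is a much-simplified sketch of my own, with a ready queue standing in for real I/O dispatch:)

    from collections import deque

    def run(tasks):
        ready = deque(tasks)               # each task is a generator
        wakeup = {}                        # result to deliver per task
        while ready:
            task = ready.popleft()
            try:
                request = task.send(wakeup.pop(task, None))
            except StopIteration:
                continue                   # task finished
            # A real trampoline would block here until the I/O named by
            # `request` completes; we just answer immediately.
            wakeup[task] = ('done', request)
            ready.append(task)

    def worker(name):
        reply = yield ('read', name)       # "block" until I/O completes
        print(name, 'woke up with', reply)

    run([worker('a'), worker('b')])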
> So, it is clear that generators are extremely versatile, and PEP 380
> deserves several good use cases to explain all the API subtleties.
I like your example because it matches the way I would have used generators to solve it. OTOH, it is not hard to rewrite parallel_reduce as a traditional function. In fact, the result is a bit shorter and quite a bit faster so it is not a good example of what you need generators for.
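(Something along these lines; the signature is my guess, since the example isn't reproduced in this message:)

    def parallel_reduce(iterable, funcs):
        # Plain-function version: seed each reducer with the first
        # element, then feed every remaining value to all of them.
        it = iter(iterable)
        try:
            result = [next(it)] * len(funcs)
        except StopIteration:
            raise TypeError('parallel_reduce() of empty sequence')
        for value in it:
            result = [f(r, value) for f, r in zip(funcs, result)]
        return result

    print(parallel_reduce([3, 1, 4, 1, 5], [min, max]))   # -> [1, 5]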
> BTW, while I have you, what do you think of Greg's "cofunctions"
> proposal?
I'll have to get back to you on that.
- Jacob