[Python-ideas] Possible PEP 380 tweak
Guido van Rossum guido at python.org
Mon Oct 25 17:13:19 CEST 2010
- Previous message: [Python-ideas] textFromMap(seq , map=None , sep='' , ldelim='', rdelim='')
- Next message: [Python-ideas] Possible PEP 380 tweak
- Messages sorted by: [ date ] [ thread ] [ subject ] [ author ]
[Changed subject]
On 2010-10-25 04:37, Guido van Rossum wrote:
This should not require threads.
Here's a bare-bones sketch using generators: [...]
On Mon, Oct 25, 2010 at 3:19 AM, Jacob Holm <jh at improva.dk> wrote:
If you don't care about allowing the funcs to raise StopIteration, this can actually be simplified to: [...]
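Jacob's simplified code is elided above, but the throw()-and-catch pattern under discussion can be sketched in present-day Python. This is a reconstruction, not the code from the thread; the names reducecollector and parallelreduce follow the discussion, while the helper finish is invented here:

```python
import operator

def reducecollector(func):
    try:
        outcome = yield          # the first value primes the accumulator
    except GeneratorExit:
        return None              # closed before seeing any value
    while True:
        try:
            val = yield
        except GeneratorExit:
            return outcome       # the result rides out on StopIteration
        outcome = func(outcome, val)

def finish(gen):
    # Throw GeneratorExit in and catch the StopIteration that carries the
    # generator's return value -- the dance that a value-returning close()
    # would make unnecessary.  (An invented helper, not from the thread.)
    try:
        gen.throw(GeneratorExit)
    except StopIteration as exc:
        return exc.value
    return None                  # unreachable here: the generator never
                                 # yields again after catching GeneratorExit

def parallelreduce(iterable, funcs):
    collectors = [reducecollector(func) for func in funcs]
    for coll in collectors:
        next(coll)               # prime each generator to its first yield
    for val in iterable:
        for coll in collectors:
            coll.send(val)
    return [finish(coll) for coll in collectors]

print(parallelreduce([1, 2, 3, 4], [operator.add, max, min]))  # [10, 4, 1]
```

With a close() that returned a value, finish(coll) would collapse to coll.close(), which is exactly the simplification discussed below.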
Indeed, I realized this after posting. :-) I had several other ideas for improvements, e.g. being able to pass an initial value to the reduce-like function or even being able to supply a reduce-like function of one's own.
More interesting (to me at least) is that this is an excellent example of why I would like to see a version of PEP 380 where "close" on a generator can return a value. (AFAICT the version of PEP 380 at http://www.python.org/dev/peps/pep-0380 is not up to date and does not mention this possibility, or even link to the heated discussion we had on python-ideas around March/April 2009.)
Can you dig up the link here?
I recall that discussion but I don't recall a clear conclusion coming from it -- just heated debate.
Based on my example I have to agree that returning a value from close() would be nice. There is one small detail: how should multiple arguments to StopIteration be interpreted? But that's not so important if it's being raised by a return statement.
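(A later note, for readers following along: the eventual Python 3.3 implementation of PEP 380 settled exactly this detail. StopIteration keeps all its arguments in args, but the value attribute that a return statement populates is just the first argument, or None if there are none.)

```python
# How CPython (3.3+) ultimately interpreted multiple StopIteration
# arguments: .args keeps everything, .value is only the first one.
exc = StopIteration("a", "b")
print(exc.args)               # ('a', 'b')
print(exc.value)              # a
print(StopIteration().value)  # None
```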
Assuming that "close" on a reducecollector generator instance returns the value of the StopIteration raised by the "return" statements, we can simplify the code even further:
def reducecollector(func):
    try:
        outcome = yield
    except GeneratorExit:
        return None
    while True:
        try:
            val = yield
        except GeneratorExit:
            return outcome
        outcome = func(outcome, val)

def parallelreduce(iterable, funcs):
    collectors = [reducecollector(func) for func in funcs]
    for coll in collectors:
        next(coll)
    for val in iterable:
        for coll in collectors:
            coll.send(val)
    return [coll.close() for coll in collectors]

Yes, this is only saving a few lines, but I find it much more readable...
I totally agree that not having to call throw() and catch whatever it bounces back is much nicer. (Now I wish there was a way to avoid the "try..except GeneratorExit" construct in the generator, but I think I should stop while I'm ahead. :-)
The interesting thing is that I've been dealing with generators used as coroutines or tasks intensely on and off since July, and I haven't had a single need for any of the three patterns that this example happened to demonstrate:
- the need to "prime" the generator in a separate step
- throwing and catching GeneratorExit
- getting a value from close()
(I did have a lot of use for send(), throw(), and extracting a value from StopIteration.)
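For readers following along, the first and last points are easy to demonstrate in plain Python (using today's syntax, where a return statement in a generator supplies the StopIteration value). The decorator name "coroutine" is conventional, not something from this thread:

```python
import functools

def coroutine(func):
    # Wrap a generator function so callers never need the separate
    # "priming" next() call before the first send().
    @functools.wraps(func)
    def start(*args, **kwargs):
        gen = func(*args, **kwargs)
        next(gen)                # advance to the first yield
        return gen
    return start

@coroutine
def averager():
    total = count = 0
    while True:
        val = yield
        if val is None:          # sentinel: report the running average
            return total / count
        total += val
        count += 1

avg = averager()
avg.send(10)                     # no priming call needed
avg.send(20)
try:
    avg.send(None)
except StopIteration as exc:     # extract the value from StopIteration
    print(exc.value)             # 15.0
```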
In my context, generators are used to emulate concurrently running tasks, and "yield" is always used to mean "block until this piece of async I/O is complete, and wake me up with the result". This is similar to the "classic" trampoline code found in PEP 342.
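A minimal sketch of that trampoline idea: each task is a generator, and yielding a request asks the scheduler to perform the "I/O" and send the result back in. All names here are illustrative, and the I/O is faked as completing immediately; a real scheduler would wait for readiness before requeueing a task:

```python
from collections import deque

def run(tasks):
    # Round-robin scheduler: (task, value-to-send-in) pairs.
    ready = deque((task, None) for task in tasks)
    while ready:
        task, value = ready.popleft()
        try:
            request = task.send(value)     # resume until the next "I/O"
        except StopIteration:
            continue                       # task finished
        # Fake the async operation completing at once: uppercase the
        # request and hand it back as the result of the yield.
        ready.append((task, request.upper()))

def greeter(name):
    reply = yield "hello " + name          # "block" until I/O completes
    print(reply)

run([greeter("world"), greeter("guido")])  # HELLO WORLD / HELLO GUIDO
```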
In fact, when I wrote the example for this thread, I fumbled a bit because the use of generators there is different from how I had been using them (though it was no doubt thanks to having worked with them intensely that I came up with the example quickly).
So, it is clear that generators are extremely versatile, and PEP 380 deserves several good use cases to explain all the API subtleties.
BTW, while I have you, what do you think of Greg's "cofunctions" proposal?
-- --Guido van Rossum (python.org/~guido)