[Python-Dev] Re: anonymous blocks

Guido van Rossum gvanrossum at gmail.com
Thu Apr 28 02:19:08 CEST 2005


[Phillip]

It's not unlike David Mertz' articles on implementing coroutines and multitasking using generators, except that I'm adding more "debugging sugar", if you will, by making the tracebacks look normal. It's just that the how requires me to pass the traceback into the generator. At the moment, I accomplish that by doing a 3-argument raise inside of 'events.resume()', but it would be really nice to be able to get rid of 'events.resume()' in a future version of Python.
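
The mechanism described here -- resuming a generator with an exception so that the traceback runs through the generator's own frame -- can be sketched with the generator.throw() method that later versions of Python added (PEP 342). The names below are illustrative only, not the actual 'events' API being discussed:

```python
# Sketch: resuming a generator *with an exception*, so the traceback
# passes through the generator's own frame. Modern Python spells this
# generator.throw(); the 'events.resume()' above would have used a
# 3-argument raise to the same effect. All names here are illustrative.

handled = []

def task():
    try:
        yield "waiting"              # suspension point
    except ValueError as exc:
        handled.append(str(exc))     # the error arrives inside the frame

g = task()
next(g)                              # advance to the first yield
try:
    g.throw(ValueError("boom"))      # raise inside the generator's frame
except StopIteration:
    pass                             # generator caught it and returned
```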

I'm not familiar with Mertz' articles and frankly I still fear it's head-explosive material. ;-)

I think maybe I misspoke. I mean adding to the traceback so that when the same error is reraised, the intervening frames are included, rather than lost.

In other words, IIRC, the traceback chain is normally increased by one entry for each frame the exception escapes. However, if you start hiding that inside of the exception instance, you'll have to modify the exception object instead of just modifying the thread state. Does that make sense, or am I missing something?

Adding to the traceback chain already in the exception object is totally kosher, if that's where the traceback is kept.
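
The shape of that chain is easy to see in later versions of Python, where the traceback is in fact kept on the exception object as __traceback__. A minimal sketch:

```python
# Sketch: the traceback chain grows by one entry per frame the
# exception escapes, and the chain lives on the exception instance
# itself (its __traceback__ attribute, in modern Python).

def inner():
    raise ValueError("boom")

def outer():
    inner()          # adds one frame to the chain as the exception escapes

frames = []
try:
    outer()
except ValueError as e:
    tb = e.__traceback__         # the chain kept on the exception
    while tb is not None:
        frames.append(tb.tb_frame.f_code.co_name)
        tb = tb.tb_next
# the chain ends with the 'outer' and 'inner' frames, one entry each
```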

My point was mainly that we can err on the side of caller convenience rather than callee convenience, if there are fewer implementations. So, e.g., multiple methods aren't a big deal if they make the 'block' implementation simpler, since only generators and a handful of special template objects are going to need to implement the block API.

Well, the way my translation is currently written, writing next(itr, arg, exc) is a lot more convenient for the caller than having to write

# if exc is True, arg is an exception; otherwise arg is a value
if exc:
    err = getattr(itr, "__error__", None)
    if err is not None:
        VAR1 = err(arg)
    else:
        raise arg
else:
    VAR1 = next(itr, arg)

but since this will actually be code generated by the bytecode compiler, I think callee convenience is more important. And the ability to default __error__ to raising the exception makes a lot of sense. And we could wrap all this inside the next() built-in: even if the actual object should have separate next() and __error__() methods, the user-facing built-in next() function might take an extra flag to indicate that the argument is an exception, and handle it appropriately (as shown above).
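
As a sketch of that dispatch, a combined entry point might look like the following. The resume() helper, the Template class, and its seen attribute are hypothetical illustrations; only the __error__ fallback logic is taken from the translation above:

```python
# Hedged sketch of a single entry point that routes a value to next()
# and an exception to __error__(), falling back to raising when
# __error__ is absent. None of these names are a real Python API.

def resume(itr, arg, exc=False):
    if exc:
        err = getattr(itr, "__error__", None)
        if err is not None:
            return err(arg)      # template handles the exception itself
        raise arg                # default: propagate to the caller
    return itr.next(arg)         # plain resumption with a value

class Template:
    def __init__(self):
        self.seen = []
    def next(self, arg):
        self.seen.append(arg)
        return arg
    def __error__(self, exc):
        self.seen.append(exc)
        return "recovered"

t = Template()
resume(t, 42)                          # -> 42
resume(t, ValueError("oops"), True)    # -> "recovered"
```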

[Phillip]
So, I guess I'm thinking you'd have something like tp_block_resume and tp_block_error type slots, and generators' tp_iternext would just be the same as tp_block_resume(None).

I hadn't thought much about the C-level slots yet, but this is a reasonable proposal.

Note that it also doesn't require a 'next()' builtin, or a __next__ vs. next distinction, if you don't try to overload iteration and templating. The fact that a generator can be used for templating doesn't have to imply that any iterator should be usable as a template, or that the iteration protocol is involved in any way. You could just have resume/error matching the tp_block_* slots. This also has the benefit of making the delineation between template blocks and for loops more concrete. For example, this:

block open("filename") as f:
    ...

could be an immediate TypeError (due to the lack of a resume) instead of biting you later on in the block when you try to do something with f, or because the block is repeating for each line of the file, etc.

I'm not convinced of that, especially since all generators will automatically be usable as templates, whether or not they were intended as such. And why shouldn't you be allowed to use a block for looping, if you like the exit behavior (guaranteeing that the iterator is exhausted when you leave the block in any way)?
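
That exit guarantee can be sketched with a generator whose try/finally runs no matter how the loop is left. Here generator.close() -- another later addition to Python -- is what forces the exhaustion; open_lines is just an illustrative name:

```python
# Sketch of the looping-block exit behavior: leaving the loop early
# still runs the generator's cleanup, because close() exhausts it.

cleaned_up = []

def open_lines(lines):
    try:
        for line in lines:
            yield line
    finally:
        cleaned_up.append(True)   # guaranteed on any exit from the block

g = open_lines(["a", "b", "c"])
for line in g:
    if line == "b":
        break                     # leave the "block" early
g.close()                         # exhausts the generator, runs finally
```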

--
--Guido van Rossum (home page: http://www.python.org/~guido/)


