[Python-Dev] Using defaultdict as globals/locals for eval()
Martin v. Loewis martin@v.loewis.de
25 Oct 2002 18:50:02 +0200
- Previous message: [Python-Dev] Using defaultdict as globals/locals for eval()
- Next message: [Python-Dev] Using defaultdict as globals/locals for eval()
- Messages sorted by: [ date ] [ thread ] [ subject ] [ author ]
Geert Jansen <geertj@boskant.nl> writes:
Is there a solution to this problem in sight?
No, although contributions are welcome.
Or alternately, is there a way I can find out which variables are used inside a compiled code block, so that I can initialize the unspecified ones? I have a vague memory that the nested-scopes feature has to determine at compile time which variables a code block uses.
Sure: you can parse the code, build an AST tuple (or list), and traverse that.
This, of course, is off-topic for python-dev.
Regards, Martin
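[Editor's note: Martin's suggestion can be sketched in modern terms. The snippet below uses today's ast module rather than the AST-tuple interface available in 2002, and the example expression and the helper name names_used are illustrative, not from the thread. It walks the parsed tree, collects every name that is read, and uses that set to pre-initialize variables before calling eval().]

```python
import ast

def names_used(source):
    """Collect every variable name that is read (loaded) in a code
    block, by parsing the source and traversing the resulting AST --
    the traversal Martin describes, done with the modern ast module."""
    tree = ast.parse(source)
    return {node.id for node in ast.walk(tree)
            if isinstance(node, ast.Name) and isinstance(node.ctx, ast.Load)}

# Hypothetical expression for illustration.
expr = "a + b * c"
print(sorted(names_used(expr)))  # prints ['a', 'b', 'c']

# Initialize any unspecified variables to a default before evaluating,
# which sidesteps the need for a defaultdict as the globals mapping.
env = {name: 0 for name in names_used(expr)}
print(eval(expr, env))  # prints 0
```

A compiled code object's co_names attribute offers a cheaper (if coarser) alternative when the source is an expression: compile(expr, "<string>", "eval").co_names lists the global names it references.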