[Python-Dev] Preventing recursion core dumps

Guido van Rossum guido@beopen.com
Fri, 11 Aug 2000 07:47:09 -0500


I'm looking at preventing core dumps due to recursive calls. With simple nested-call counters for the relevant functions in object.c, limiting recursion to 500 levels, I think this works okay for repr, str, and print. It solves most of the complaints, like:

    class Crasher:
        def __str__(self): print self

    print Crasher()

With such protection, instead of a core dump, we'll get an exception:

    RuntimeError: Recursion too deep
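For readers following along today: the guard being proposed here did land, and the crash above now surfaces as a catchable exception. A minimal re-creation in Python 3 syntax (the original uses the Python 2 print statement; RecursionError is a later subclass of RuntimeError):

```python
# Hypothetical Python 3 equivalent of the example above:
# __str__ calls str() on the same object, so every str() call
# re-enters __str__ and the recursion check eventually fires.
class Crasher:
    def __str__(self):
        return str(self)  # infinite recursion

try:
    str(Crasher())
except RecursionError as e:  # RecursionError subclasses RuntimeError
    print("caught:", type(e).__name__)
```

The point is only that the interpreter survives: the runaway recursion is converted into an exception the caller can handle.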

So far, so good. 500 nested calls to repr, str, or print are likely to be programming bugs. Now I wonder whether it's a good idea to do the same thing for getattr and setattr, to avoid crashes like:

    class Crasher:
        def __getattr__(self, x): return self.x

    Crasher().bonk

Solving this the same way is likely to slow things down a bit, but would prevent the crash. OTOH, in a complex object hierarchy with tons of delegation and/or lookup dispatching, 500 nested calls is probably not enough. Or am I wondering too much? Opinions?
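Why this example recurses at all may not be obvious: __getattr__ is only invoked for attributes that normal lookup fails to find, and self.x is exactly such an attribute, so each lookup re-enters the hook. A hedged Python 3 re-creation:

```python
# Hypothetical Python 3 equivalent of the __getattr__ example:
# 'x' is never set on the instance, so accessing self.x inside
# __getattr__ triggers __getattr__ again, without end.
class Crasher:
    def __getattr__(self, name):
        return self.x  # missing attribute -> re-enters __getattr__

try:
    Crasher().bonk
except RecursionError as e:
    print("caught:", type(e).__name__)
```

In today's CPython this no longer segfaults; the generic recursion check turns it into an exception.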

In your examples there's recursive Python code involved. There's already a generic recursion check for that, but the limit is too high (the latter example segfaults for me too, while a simple def f(): f() gives a RuntimeError).
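The generic check referred to here is the interpreter-wide recursion limit on Python-level calls. In modern CPython it is exposed (a later addition, shown as an illustration of the "tune the limit" idea) via sys.getrecursionlimit() and sys.setrecursionlimit():

```python
import sys

def f():
    f()  # the "simple def f(): f()" case from the message

# Plain Python recursion hits the interpreter-wide limit and
# raises an exception rather than overflowing the C stack.
try:
    f()
except RecursionError as e:
    print("caught:", type(e).__name__)

# The limit itself is tunable, which is the alternative to
# special-casing individual functions like repr or getattr.
print("limit:", sys.getrecursionlimit())
```

Lowering the limit makes the generic check fire earlier for all recursive Python code, which is exactly the tuning suggested in the next paragraph.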

It seems better to tune the generic check than to special-case str, repr, and getattr.

--Guido van Rossum (home page: http://www.pythonlabs.com/~guido/)