msg1685
Author: Martin v. Löwis (loewis)
Date: 2000-09-28 10:42
Compiling s=[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]] gives an error message s_push: parser stack overflow Python 1.5.2 then reports a MemoryError, 2.0b2 a SyntaxError. |
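
A minimal reproduction sketch of the report above (the depth of 200 is an arbitrary choice, comfortably past the old 35-level limit; the exact exception depends on the interpreter version):

    # Build a deeply nested list literal and compile it.  On 1.5.2 this
    # surfaced as MemoryError, on 2.0b2 as SyntaxError; recent interpreters
    # accept it or raise a different limit error.
    depth = 200  # arbitrary, well past the old limit of 35 nested lists
    src = "s=" + "[" * depth + "]" * depth
    try:
        compile(src, "<bug-report>", "exec")
        print("accepted")
    except (SyntaxError, MemoryError) as exc:
        print(type(exc).__name__, exc)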
|
|
msg1686
Author: Jeremy Hylton (jhylton)
Date: 2000-09-28 19:35
What is the desired outcome here? Python is reporting a SyntaxError; it's not crashing. Would you like to increase the max stack size for the parser? What should its limit be? The current parser stack limit allows eval("["*x+"]"*x) for values of x up to and including 35. I'm not sure why the limit is set this low. I bumped the value of MAXSTACK in parser.c from 500 to 5000 and it accepted the nested list expression for values of x up to 357.
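
A sketch of that probe, along the lines of the eval("["*x+"]"*x) test (the upper bound of 2000 is an arbitrary safety cap, since very deep nesting can exhaust the C stack on some builds):

    # Probe the largest accepted nesting depth by brute force.
    def max_nesting(limit=2000):
        best = 0
        for x in range(1, limit + 1):
            try:
                eval("[" * x + "]" * x)
            except (SyntaxError, MemoryError):
                break
            best = x
        return best

    print(max_nesting())  # 35 with the old MAXSTACK=500; larger elsewhere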
|
|
msg1687
Author: Martin v. Löwis (loewis)
Date: 2000-09-29 16:08
It was confusing that Python would produce a SyntaxError for an obviously correct script, and that such a small limit was found in the parser. Since the limit is not due to recursion on the C stack: would a patch removing this limitation be accepted (certainly not for 2.0)? The two alternatives I see are to make the array completely dynamic, or to allocate a dynamic array in the stack if the static one overflows.
|
|
msg1688
Author: Jeremy Hylton (jhylton)
Date: 2000-09-30 04:48
There is a limit that is based on the C stack, because the parser is recursive descent. If I set the max stack size to 100000, I get a seg fault. I'm not sure if a patch for this would be accepted post 2.0 or not; I'll talk to Guido and see what he thinks. I think we could safely increase the static limit before 2.0, though I'm not 100% certain. What nesting level did your application come up with? I would guess that max stack == 10000 (700 nested lists) should be safe on all reasonable platforms and much more useful.
|
|
msg1689
Author: Barry A. Warsaw (barry)
Date: 2000-10-01 02:12
BTW, JPython gets to 133 nestings (on my RH6.1 system) before a java.lang.StackOverflow gets thrown. Ever heard the old joke "Doctor, it hurts when I do this?" ...
|
|
msg1690
Author: Martin v. Löwis (loewis)
Date: 2000-10-02 09:23
The problem was found when printing an expression like parser.suite("3*4").tolist(), then modifying the string, and feeding the outcome back to Python. It is not a serious problem if this does not work for every possible piece of Python code - it is just confusing to get a SyntaxError when there is no syntax error in the input. BTW, I believe the parser is *not* recursive on the C stack; I could not find any sign of recursion inside Parser/parser.c. Most likely, the crash you got when increasing the parser stack size comes from Python/compile.c; the com_* functions are recursive on the C stack. It would be possible to remove the recursion there as well (using an explicit stack that lives on the heap), though that would require a larger rewrite of compile.c.
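
A sketch of that workflow, using the old parser module (available through Python 3.9, removed in 3.10); the round trip through repr() is what produces the heavily nested brackets:

    import parser  # the pre-3.10 standard-library parser module

    st = parser.suite("3*4")
    tree = st.tolist()      # nested lists of symbol/token numbers
    text = repr(tree)       # printing the tree yields many nested brackets

    # Editing the text and feeding it back re-parses all those brackets;
    # with a larger suite the tree nests deeply enough to hit the old
    # MAXSTACK limit when the text is re-parsed.
    rebuilt = eval(text)
    parser.sequence2st(rebuilt).compile()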
|
|
msg1691
Author: Guido van Rossum (gvanrossum)
Date: 2000-10-02 10:22
The SyntaxError is actually caused by a bug in the parser code! When s_push() reports a stack overflow it returns -1, which is passed through unchanged by push() -- but the code that calls it only checks for positive error codes! I've fixed this by returning E_NOMEM from s_push() instead. The only downside I see of making MAXSTACK larger is that a stack of maximal size is allocated each time the parser is invoked. With MAXSTACK=500 on a 32-bit machine, that's 6K. Switch to 5000 and it's 60K. No big deal except perhaps for ports to PalmPilots and the like... Note that with MAXSTACK=500, you can "only" have 35 nested lists.
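
A Python-level analogy of that control flow (not the actual C code in Parser/parser.c; the E_NOMEM value here is only a stand-in for the constant in Include/errcode.h). As an aside, the 6K figure follows from 500 stack entries at roughly 12 bytes each on a 32-bit build.

    E_NOMEM = 15  # stand-in; positive, unlike the -1 that was slipping through

    def s_push(stack, entry, maxstack=500):
        if len(stack) >= maxstack:
            print("s_push: parser stack overflow")
            return -1          # old behaviour: negative overflow signal
        stack.append(entry)
        return 0

    def push(stack, entry):
        return s_push(stack, entry)   # error code passed through unchanged

    def add_token(stack, entry):
        err = push(stack, entry)
        if err > 0:                   # only positive codes treated as errors,
            raise MemoryError("parser out of memory")
        # ...so the -1 was ignored and the parse later failed with a bogus
        # SyntaxError.  The fix makes s_push() return E_NOMEM instead of -1,
        # so the check above fires and a MemoryError is reported.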
|
|
msg1692
Author: Jeremy Hylton (jhylton)
Date: 2000-10-06 15:55
Guido fixed the bug part. Defer any other changes to post 2.0. Added to PEP 42.
|
|
msg60086
Author: Ralf Schmitt (schmir)
Date: 2008-01-18 11:32
Well, I've been a victim of this one yesterday in a real-world example. I'm logging the repr of arguments to XMLRPC method calls, and we happen to use nested lists, which were deep enough to overflow that stack. It's now 8 years later and I can live with the parser using 60K of memory. I think this limit should be upped a bit.
|
|
msg60087
Author: Ralf Schmitt (schmir)
Date: 2008-01-18 11:33
Of course the problem was not the logging itself, but that I wanted to "replay" those commands. This is where I got the error.
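
A sketch of that scenario (the logging and replay machinery is hypothetical; only the nested-list round trip matters):

    # Build an argument like the ones being logged: a deeply nested list.
    nested = []
    for _ in range(100):          # deeper than the old 35-level limit
        nested = [nested]
    logged_line = repr(nested)    # what ends up in the call log

    # "Replaying" a call means re-parsing the logged text.  On interpreters
    # with the old MAXSTACK=500 this raised the parser stack overflow even
    # though the text is perfectly valid Python.
    replayed = eval(logged_line)
    assert replayed == nested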
|
|
msg60095
Author: Guido van Rossum (gvanrossum)
Date: 2008-01-18 15:44
Fine, submit a patch. Might as well open a new bug for the patch (referring to this one for background).
|
|
msg61382
Author: Ralf Schmitt (schmir)
Date: 2008-01-21 10:16
Ok, I've upped the limit to some very high value and tried to provoke a stack overflow. eval("["*x+"]"*x) segfaults on my machine for x somewhere around 20000 (linux, amd64). When setting MAXSTACK to 5000, eval("["*x+"]"*x) works for x <= 333. So I guess this should be a safe value (even for the BSDs, which have a smaller default stack size). BTW: the hardest part was recognizing that nothing gets rebuilt and that more than parser.o depended on parser.h. Don't you have some kind of automatic dependency tracking? Or am I missing some build tools? Anyway, patch in http://bugs.python.org/issue1881
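
A sketch of how such a probe can be run without the segfault killing the probing script itself: each candidate depth is tried in a child interpreter, and a binary search (assuming the pass/fail boundary is monotonic in x) narrows down the limit:

    import subprocess, sys

    def nesting_ok(x):
        code = 'eval("[" * %d + "]" * %d)' % (x, x)
        proc = subprocess.run([sys.executable, "-c", code],
                              capture_output=True)
        return proc.returncode == 0   # non-zero for SyntaxError or a crash

    lo, hi = 1, 20000                 # 20000 is the reported segfault region
    while lo < hi:
        mid = (lo + hi + 1) // 2
        if nesting_ok(mid):
            lo = mid
        else:
            hi = mid - 1
    print("largest accepted nesting:", lo)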
|
|
msg61387
Author: Christian Heimes (christian.heimes)
Date: 2008-01-21 13:12
Parser/parser.h was not in the list of dependencies. I fixed it in r60151.
|
|