[Python-Dev] Community buildbots (was Re: User's complaints)
Tim Peters tim.peters at gmail.com
Sun Jul 16 05:47:50 CEST 2006
[Neal Norwitz]
... That leaves 1 unexplained failure on a Windows bot.
It wasn't my Windows bot, but I believe test_profile has failed (rarely) on several of the bots, in the same (or a very similar) way. Note that the failure went away on the Windows bot in question the next time the tests were run on it. That's typical of test_profile failures.
Unfortunately, because test_profile works by comparing stdout against a canned expected-output file under Lib/test/output/, and that comparison isn't done when we re-run the test in verbose mode at the end, there isn't a simple way to know whether the test passed or failed during its second run. I bet it actually passed, but don't know.
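The canned-output scheme can be sketched roughly like this (hypothetical helper name and simplified logic; the real machinery lives in regrtest.py). The point to notice is the verbose branch, where the comparison is simply skipped:

```python
import io
import os
from contextlib import redirect_stdout

def run_canned_output_test(test_name, test_func, output_dir="Lib/test/output",
                           verbose=False):
    """Sketch of regrtest's old canned-output scheme: capture the test's
    stdout and compare it against the file output_dir/<test_name>.

    Returns True if the test "passed", False on an output mismatch.
    """
    if verbose:
        # In verbose mode, output goes straight to the terminal and is
        # NOT compared against the canned file -- so an output mismatch
        # cannot be detected on this path.  The test "passes" so long as
        # it doesn't raise.
        test_func()
        return True
    buf = io.StringIO()
    with redirect_stdout(buf):
        test_func()
    with open(os.path.join(output_dir, test_name)) as f:
        expected = f.read()
    return buf.getvalue() == expected
```

A test whose output varies from run to run (as profiler timings can) will fail this comparison intermittently in normal mode, yet always "pass" when re-run verbosely.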
This problem is obscured because when you run regrtest.py "by hand" with -v, you get this message at the end:
""" CAUTION: stdout isn't compared in verbose mode: a test that passes in verbose mode may fail without it. """
which is intended to alert you to the possible problem.
But when we re-run failing tests in verbose mode "by magic" (via passing -w to regrtest.py), that warning isn't produced.
BTW, the best solution to this is to convert all the tests away from regrtest's original "compare stdout against a canned output/TEST_NAME file" scheme. That won't fix test_profile, but would make it less of a puzzle to figure out what's wrong with it.
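Converting such a test means replacing printed output with explicit assertions on the results, so nothing depends on a byte-for-byte stdout match. A minimal illustration (not the actual test_profile code) of what a converted profiler test might look like:

```python
import cProfile
import io
import pstats
import unittest

class ProfileSmokeTest(unittest.TestCase):
    """Instead of printing profiler output and diffing it against a canned
    file, assert directly on the collected stats."""

    def test_function_is_recorded(self):
        def workload():
            return sum(range(100))

        prof = cProfile.Profile()
        prof.runcall(workload)

        stream = io.StringIO()
        stats = pstats.Stats(prof, stream=stream)
        stats.print_stats()

        # The profiled function should appear in the report; exact timings
        # are machine-dependent, so we deliberately don't compare them.
        self.assertIn("workload", stream.getvalue())

if __name__ == "__main__":
    unittest.main()
```

Because the assertion targets what the test actually cares about (the function was profiled) rather than the full formatted report, it behaves the same in verbose and non-verbose runs.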