[Python-Dev] robots exclusion file on the buildbot pages?

"Martin v. Löwis" martin at v.loewis.de
Sat May 15 21:49:07 CEST 2010


> The buildbots are sometimes subject to a flood of "svn exception" errors. It has been conjectured that these errors are caused by Web crawlers pressing "force build" buttons without filling in any of the fields (of course, the fact that we get such ugly errors in the buildbot results, rather than a clean error message when the button is pressed, is a buildbot bug in itself). Couldn't we simply exclude all crawlers from the buildbot Web pages?

Hmm. Before making any modifications, I'd rather have a definite analysis of this. Are you absolutely certain that, when that happened, the individual builds that caused this svn exception were actually triggered over the web, rather than by checkin?
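One way to check is to look at the reason recorded on the build page. Here is a minimal sketch, assuming a hypothetical build-page URL (builder name and build number are placeholders) and assuming the page shows a "Reason:" line, as the Buildbot web UI of that era did; a forced build carries a "force build" reason, while a checkin-triggered build carries the changeset information instead:

    import re
    import urllib.request

    # Hypothetical build page; adjust builder name and build number.
    URL = "http://buildbot.python.org/all/builders/x86%20XP/builds/123"

    with urllib.request.urlopen(URL) as f:
        page = f.read().decode("utf-8", errors="replace")

    # Assumption: the page contains a line like "Reason: The web-page
    # 'force build' button was pressed ..." for web-triggered builds.
    m = re.search(r"Reason:\s*(.+)", page)
    print(m.group(1) if m else "no reason found")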

When it happens next, please report the exact date and time, and the build log URL. Due to log rotation, it would then be necessary to investigate in a timely manner.

Without any reference to the specific case, I'd guess that a flood of svn exceptions is caused by an svn outage, which in turn might occur when a build is triggered while the daily Apache restart happens (i.e. around 6:30 UTC+2).
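Once the exact timestamps are known, this conjecture is easy to test: check whether the failing build started near the restart. A minimal sketch, assuming a five-minute slack on either side of 6:30 UTC+2 (the window width is an assumption):

    from datetime import datetime, time, timedelta, timezone

    CEST = timezone(timedelta(hours=2))  # UTC+2
    RESTART = time(6, 30)                # daily Apache restart, per above
    WINDOW = timedelta(minutes=5)        # assumed slack on either side

    def in_restart_window(build_start: datetime) -> bool:
        """True if the build started within WINDOW of the daily restart."""
        local = build_start.astimezone(CEST)
        restart = local.replace(hour=RESTART.hour, minute=RESTART.minute,
                                second=0, microsecond=0)
        return abs(local - restart) <= WINDOW

    # Example: a build starting at 04:28 UTC is 06:28 UTC+2 -- inside.
    print(in_restart_window(datetime(2010, 5, 15, 4, 28,
                                     tzinfo=timezone.utc)))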

That said: /dev/buildbot has been disallowed for all robots for quite some time now:

http://www.python.org/robots.txt
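For reference, the relevant stanza would look like this (a sketch of the standard robots exclusion syntax; the actual file may list other paths as well):

    User-agent: *
    Disallow: /dev/buildbot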

There is really no point in robots crawling the build logs, as they don't contain much information that is useful to a search engine.
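The exclusion can also be verified programmatically. A minimal sketch using urllib.robotparser from the Python 3 standard library (in Python 2, the module was called robotparser); a well-behaved crawler performs exactly this check before fetching a page:

    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser("http://www.python.org/robots.txt")
    rp.read()

    # Expect False, i.e. the path is disallowed for all user agents.
    print(rp.can_fetch("*", "http://www.python.org/dev/buildbot/"))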

Regards, Martin


