Showing content from https://mail.python.org/pipermail/python-dev/2010-May/100095.html below:

[Python-Dev] robots exclusion file on the buildbot pages?
"Martin v. Löwis" martin at v.loewis.de
Sat May 15 21:49:07 CEST 2010
> The buildbots are sometimes subject to a flood of "svn exception"
> errors. It has been conjectured that these errors are caused by Web
> crawlers pressing "force build" buttons without filling any of the
> fields (of course, the fact that we get such ugly errors in the
> buildbot results, rather than a clean error message when pressing
> the button, is a buildbot bug in itself). Couldn't we simply exclude all
> crawlers from the buildbot Web pages?

Hmm. Before making any modifications, I'd rather have a definite analysis
of this. Are you absolutely certain that, when that happened, the
individual builds that caused this svn exception were actually
triggered over the web, rather than by a checkin?

When it happens next, please report exact date and time, and the build
log URL. Due to log rotation, it would then be necessary to investigate
that in a timely manner.

Without any reference to the specific case, I'd guess that a flood of
svn exceptions is caused by an svn outage, which in turn might occur
when a build is triggered while the daily Apache restart happens
(i.e. around 6:30 UTC+2).

That said: /dev/buildbot has been disallowed for all robots for quite
some time now:

http://www.python.org/robots.txt

There is really no point robots crawling the build logs, as they don't
contain much useful information for a search engine.
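(As an aside, not part of the original message: the effect of a disallow rule like the one described above can be checked locally with Python's standard urllib.robotparser. The rule text below is a sketch mirroring the /dev/buildbot exclusion mentioned in the thread, not the full contents of python.org's robots.txt.)

```python
from urllib.robotparser import RobotFileParser

# Sketch: parse a rule of the form described above and check which
# URLs a compliant crawler is allowed to fetch.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /dev/buildbot",
])

# Build pages under /dev/buildbot are off limits to compliant crawlers,
# while the rest of the site remains crawlable.
print(rp.can_fetch("AnyBot", "http://www.python.org/dev/buildbot/"))  # False
print(rp.can_fetch("AnyBot", "http://www.python.org/download/"))      # True
```

Note that robots.txt is purely advisory; a crawler that ignores it could still hit the "force build" buttons, which is why a clean server-side error on an empty form submission would also be needed.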

Regards,
Martin
