[Python-Dev] Summary of Tracker Issues
"Martin v. Löwis" martin at v.loewis.de
Tue May 15 07:09:44 CEST 2007
Aahz wrote:
> On Mon, May 14, 2007, "Martin v. Löwis" wrote:
>> Skip(?):
>>> In the meantime (thinking out loud here), would it be possible to
>>> keep search engines from seeing a submission or an edit until a
>>> trusted person has had a chance to approve it?
>>
>> It would be possible, but I would strongly oppose it. A bug tracker
>> where postings need to be approved is just unacceptable.
>
> Could you expand this, please? It sounds like Skip is just talking
> about a dynamic robots.txt, essentially. Anyone coming in to the
> tracker itself should still see everything.
I must have misunderstood Skip, then - I thought he had a scheme in mind where an editor would have to approve postings before they become visible to tracker users; the tracker itself cannot distinguish between a search engine and a regular (anonymous) user.
As for a dynamically expanding robots.txt: I think that would be difficult to implement (close to impossible). At best, robots.txt can filter out entire issues, not individual messages within an issue, because its rules only match URL path prefixes. So if a spammer posts to an existing issue, no correct robots.txt can be written. Even for new issues, entries can be added to robots.txt only after the issue has been created; and since search engines are allowed to cache robots.txt, they might not see that it has changed, and fetch the very issue that was supposed to be blocked.
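To illustrate the granularity problem, here is a minimal sketch of the finest-grained rule robots.txt allows, assuming the tracker serves each issue as a single page at a URL like /issue1234 (the issue number is made up for the example):

    User-agent: *
    Disallow: /issue1234

A spam message posted into that issue is rendered as part of the same page, so there is no separate URL a Disallow line could match; hiding the one message from crawlers means hiding the whole issue, legitimate discussion included.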
Regards, Martin