[tahoe-dev] Modifying the robots.txt file on allmydata.org
David-Sarah Hopwood
david-sarah at jacaranda.org
Wed Feb 24 00:50:06 PST 2010
Peter Secor wrote:
> Hi everyone (sorry for the slightly operational message),
>
> There is currently a robots.txt[1] file which blocks crawlers from a
> few of the projects on the site, specifically everything under /trac.
Allowing crawlers to index some of the dynamically generated pages under
/trac could cause horrible breakage, given darcs+trac's performance
problems. You'd have to look at which subsets of it are sufficiently
static.
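
For example, one direction (just a sketch; the Disallow paths below are
assumptions and would need to be checked against this Trac instance's
actual URL layout) would be a robots.txt that keeps crawlers away from
the expensive dynamic views while leaving the mostly static wiki and
ticket pages reachable:

    # Hypothetical sketch only: block the expensive dynamic Trac views
    # (source browser, changesets, revision logs, timeline, search) but
    # stop short of blocking all of /trac, so the comparatively static
    # wiki and ticket pages can still be indexed.
    User-agent: *
    Disallow: /trac/browser
    Disallow: /trac/changeset
    Disallow: /trac/log
    Disallow: /trac/timeline
    Disallow: /trac/search
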
--
David-Sarah Hopwood ⚥ http://davidsarah.livejournal.com