[tahoe-dev] [tahoe-lafs] #823: WUI server should have a disallow-all robots.txt
tahoe-lafs
trac at tahoe-lafs.org
Tue Dec 28 16:00:24 UTC 2010
#823: WUI server should have a disallow-all robots.txt
-----------------------------------+----------------------------------------
     Reporter:  davidsarah         |      Owner:
         Type:  defect             |     Status:  new
     Priority:  major              |  Milestone:  undecided
    Component:  code-frontend-web  |    Version:  1.5.0
   Resolution:                     |   Keywords:  privacy
Launchpad Bug:                     |
-----------------------------------+----------------------------------------
Comment (by zooko):
I disagree with "WUI server should have a disallow-all robots.txt". I
think that if a web crawler gets access to a cap, then it ''should'' crawl
and index all the files and directories reachable from that cap. If you
want crawlers to ignore a particular directory, I suppose you can put a
{{{robots.txt}}} file into that directory in Tahoe-LAFS.
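
For concreteness, here is a minimal sketch of what the ticket is asking for:
a disallow-all {{{robots.txt}}} served by the WUI as a Twisted web resource.
The class name {{{RobotsTxt}}} and the way it gets attached to the root
resource are illustrative assumptions, not the actual Tahoe-LAFS web code:

{{{
#!python
from twisted.web import resource

class RobotsTxt(resource.Resource):
    """Serve a disallow-all robots.txt so well-behaved crawlers skip the WUI."""
    isLeaf = True

    def render_GET(self, request):
        request.setHeader(b"content-type", b"text/plain")
        # "Disallow: /" asks every crawler to stay out of the entire site.
        return b"User-agent: *\nDisallow: /\n"

# Hypothetical hook-up on whatever the WUI's root resource object is:
#   root.putChild(b"robots.txt", RobotsTxt())
}}}

The per-directory alternative suggested above could be done from the CLI
with something like {{{tahoe put robots.txt tahoe:robots.txt}}} (assuming
the default {{{tahoe:}}} alias points at the directory in question).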
--
Ticket URL: <http://tahoe-lafs.org/trac/tahoe-lafs/ticket/823#comment:6>
tahoe-lafs <http://tahoe-lafs.org>
secure decentralized storage