[rust-dev] robots.txt prevents Archive.org from storing old documentation
j.wielicki at sotecware.net
Thu Jul 10 08:49:55 PDT 2014
On 10.07.2014 16:56, Daniel Micay wrote:
> On 10/07/14 03:46 AM, Gioele Barabucci wrote:
>> the current robots.txt on docs.rust-lang.org prevents Archive.org from
>> storing copies of the old documentation. I think having the old
>> documentation archived would be a good thing. BTW, all the documentation
>> before 0.10 seems gone and this is a shame.
>> Could you please allow the Archive.org bot to index the site?
>> For the record:
>> $ curl http://doc.rust-lang.org/robots.txt
>> User-agent: *
>> Disallow: /0.3/
>> Disallow: /0.4/
>> Disallow: /0.5/
>> Disallow: /0.6/
>> Disallow: /0.7/
>> Disallow: /0.8/
>> Disallow: /0.9/
>> Disallow: /0.10/
> The old documentation is all available from the Git repository. The
> robots.txt rule is there to reverse the trend of searches being filled
> with out of date documentation.
While this is a good practice that /all/ software projects should be
following, imo, one could still explicitly allow Archive.org's crawler by
prepending:
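The snippet that followed "prepending:" appears to have been lost in the
archive. A minimal sketch of what it could look like, assuming Archive.org's
crawler still identifies itself as "ia_archiver" (per the Robots Exclusion
Protocol, a crawler obeys the most specific matching User-agent group, so
this group would override the wildcard rules for that bot):

  User-agent: ia_archiver
  Disallow:

An empty Disallow directive permits that agent to crawl everything, while
the existing "User-agent: *" rules would continue to keep the old
documentation out of search engine results.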