So how can robots.txt be used to rate-limit search engine spiders?
Several major crawlers support a Crawl-delay parameter, set to the number of seconds to wait between successive requests to the same server:
User-agent: *
Crawl-delay: 10
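Python's standard-library urllib.robotparser can read this directive directly via RobotFileParser.crawl_delay(). A minimal sketch (the inline rules mirror the example above; a real crawler would load robots.txt with set_url() and read()):

```python
from urllib.robotparser import RobotFileParser

# Parse the rules shown above. Normally you would instead call
# rp.set_url("https://example.com/robots.txt") followed by rp.read().
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Crawl-delay: 10",
])

delay = rp.crawl_delay("*")  # 10 (seconds); None when no Crawl-delay is present
print(delay)

# A polite crawler would then pause between successive requests, e.g.:
# time.sleep(delay)
```

Note that crawl_delay() returns None when the matching record carries no Crawl-delay line, so a fetch loop should supply its own fallback delay.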
We recommend using Crawl-delay for rate limiting. An Extended Standard for Robot Exclusion has been proposed, which adds several new directives, such as Visit-time and Request-rate. For example:
User-agent: *
Disallow: /downloads/
Request-rate: 1/5 # maximum rate is one page every 5 seconds
Visit-time: 0600-0845 # only visit between 06:00 and 08:45 UTC (GMT)
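urllib.robotparser also understands Request-rate (exposed as a named tuple with requests and seconds fields), though it does not parse Visit-time. A sketch against the record above, with the inline comments stripped:

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /downloads/",
    "Request-rate: 1/5",
])

rate = rp.request_rate("*")
print(rate.requests, rate.seconds)  # at most 1 page every 5 seconds

# The Disallow rule is checked with can_fetch():
print(rp.can_fetch("*", "/downloads/file.zip"))  # False: path is disallowed
```

As with Crawl-delay, request_rate() returns None when the directive is absent, so callers should handle that case.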
Closing remarks: Google Webmaster Tools also provides a per-site setting for the crawl rate. To change the crawl rate:
1. On the Webmaster Tools Home page, click the site you want.
2. Under Site configuration, click Settings.
3. In the Crawl rate section, select the option you want.

The new crawl rate will be valid for 90 days.
Welcome to 网普技术论坛 (http://bbs.netpu.net/) | Powered by Discuz! 2.5