Warn crawlers not to touch the tripwire
OpenDev's static content sites implement web application firewall rules that catch and block malicious crawlers: robots.txt instructs crawlers not to index a particular non-existent file, and any client that requests it anyway is blocked. Add a corresponding robots.txt directive to the docs.openstack.org site, mirroring the one already served at https://docs.opendev.org/robots.txt.
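A minimal sketch of such a tripwire entry (the path shown here is illustrative only, not the actual trap file used by OpenDev):

```
User-agent: *
Disallow: /trap/do-not-fetch.html
```

Well-behaved crawlers honor the Disallow rule and never request the file; a client that fetches it has demonstrably ignored robots.txt, so the firewall can safely block it.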
Change-Id: I73c5e8c027e559ad731636560277463c3f87d7bd
Signed-off-by: Jeremy Stanley <fungi@yuggoth.org>