Prevent crawls of large data files

Author: Chris Blanchard
Date: 2015-08-20 13:56:07 +01:00
Parent: 8285f15587
Commit: 6312e177f5


@@ -1,5 +1,9 @@
 # See http://www.robotstxt.org/wc/norobots.html for documentation on how to use the robots.txt file
 #
 # To ban all spiders from the entire site uncomment the next two lines:
 # User-Agent: *
 # Disallow: /
+User-Agent: *
+Disallow: /data_files
+Disallow: /directories
+Disallow: /movies
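
As a quick sanity check (not part of the commit), the added rules can be exercised with Python's standard-library urllib.robotparser. The file paths in the assertions are illustrative placeholders; only the Disallow prefixes come from the diff.

from urllib.robotparser import RobotFileParser

# Parse the new rules directly instead of fetching them over HTTP.
rules = """\
User-Agent: *
Disallow: /data_files
Disallow: /directories
Disallow: /movies
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Paths under the disallowed prefixes are blocked for every crawler...
assert not rp.can_fetch("*", "/data_files/large_dump.csv")
assert not rp.can_fetch("*", "/directories")
assert not rp.can_fetch("*", "/movies/intro.mp4")
# ...while the rest of the site stays crawlable.
assert rp.can_fetch("*", "/")
print("robots.txt rules behave as expected")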