Merge pull request #44 from cblanc/robots

Prevent crawls of large data files
simplefl 2015-08-23 21:34:23 +02:00
commit e6a8f00c30


@@ -1,5 +1,7 @@
# See http://www.robotstxt.org/wc/norobots.html for documentation on how to use the robots.txt file
#
# To ban all spiders from the entire site uncomment the next two lines:
# User-Agent: *
# Disallow: /
User-Agent: *
Disallow: /data_files
Disallow: /directories
Disallow: /movies
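The effect of the rules above can be checked with Python's standard-library `urllib.robotparser`; this is a minimal sketch (the sample paths are illustrative, not taken from the repository) showing that a well-behaved crawler is blocked from the listed prefixes but may still fetch other pages:

```python
from urllib import robotparser

# The rules added in this commit.
ROBOTS_TXT = """\
User-Agent: *
Disallow: /data_files
Disallow: /directories
Disallow: /movies
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Paths under a disallowed prefix are off-limits to all agents.
print(rp.can_fetch("*", "/data_files/large.csv"))  # False
print(rp.can_fetch("*", "/movies"))                # False

# Everything else remains crawlable.
print(rp.can_fetch("*", "/about"))                 # True
```

Note that `Disallow` matches by prefix, so `/data_files` also covers every path beneath it, and that robots.txt is advisory only: it deters compliant spiders but does not enforce access control.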