Merge pull request #44 from cblanc/robots

Prevent crawls of large data files
This commit is contained in:
simplefl 2015-08-23 21:34:23 +02:00
commit 2e0bf01460


@@ -1,5 +1,7 @@
 # See http://www.robotstxt.org/wc/norobots.html for documentation on how to use the robots.txt file
 #
 # To ban all spiders from the entire site uncomment the next two lines:
-# User-Agent: *
-# Disallow: /
+User-Agent: *
+Disallow: /data_files
+Disallow: /directories
+Disallow: /movies
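The new rules can be checked locally with Python's standard-library `urllib.robotparser` before deploying. This is a minimal sketch (the example URLs are hypothetical, not paths from the repository):

```python
from urllib import robotparser

# Parse the rules added in this PR directly from a string,
# as if they had been fetched from /robots.txt.
rules = """\
User-Agent: *
Disallow: /data_files
Disallow: /directories
Disallow: /movies
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Crawlers are blocked from the large-data paths...
print(rp.can_fetch("*", "/data_files/example.csv"))  # False
# ...but the rest of the site stays crawlable.
print(rp.can_fetch("*", "/about"))  # True
```

Note that `Disallow: /data_files` matches by prefix, so it also covers everything beneath that path.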