Create robots.txt files to control search engine crawling
Configure crawl rules and generate a robots.txt file
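A robots.txt file is plain text: one or more User-agent groups, each followed by Allow/Disallow rules. If you generate it programmatically, a minimal Python sketch is below; the agents and paths in it are placeholders, not rules this document prescribes.

    # Hypothetical rule set; the agent and paths are placeholders.
    rules = {
        "*": {
            "Disallow": ["/tmp/"],
            "Allow": ["/"],
        },
    }

    lines = []
    for agent, directives in rules.items():
        lines.append(f"User-agent: {agent}")
        for directive, paths in directives.items():
            for path in paths:
                lines.append(f"{directive}: {path}")
        lines.append("")  # a blank line ends the group

    with open("robots.txt", "w", encoding="utf-8") as f:
        f.write("\n".join(lines))

Running this writes a file whose single group, addressed to User-agent: *, applies to every crawler.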
Prevent crawling of admin and private sections
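For example, a group like the following keeps compliant crawlers out of hypothetical /admin/ and /private/ paths (the directory names are placeholders for your own):

    User-agent: *
    Disallow: /admin/
    Disallow: /private/

Note that robots.txt is advisory: it asks well-behaved crawlers to stay out but does not enforce access, so genuinely sensitive pages still need real authentication.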
Set crawl delays to reduce server load from crawlers
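The Crawl-delay directive asks a crawler to pause between requests. It is non-standard: some crawlers such as Bingbot honor it as a number of seconds, while Googlebot ignores it entirely. A sketch with an illustrative 10-second delay:

    User-agent: *
    Crawl-delay: 10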
Tell crawlers where to find your XML sitemap
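The Sitemap directive takes the absolute URL of your sitemap and can appear anywhere in the file; you can list several. The domain below is a placeholder:

    Sitemap: https://www.example.com/sitemap.xml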
Apply different rules to different search engines
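Each crawler obeys the single most specific User-agent group that matches it, so engine-specific groups override the catch-all. A sketch follows; Googlebot and Bingbot are real user-agent tokens, but the paths and delay are illustrative:

    User-agent: Googlebot
    Disallow: /search/

    User-agent: Bingbot
    Crawl-delay: 5

    User-agent: *
    Disallow: /tmp/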