What is robots.txt?
Robots.txt is a text file created by webmasters to instruct web robots how to crawl pages on their website. The file is placed at the root of the site and indicates which parts of the site you don't want accessed by search engine crawlers. It follows the Robots Exclusion Standard (RES) protocol.
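For instance, a minimal robots.txt might look like this (the paths here are hypothetical examples, not part of any real site):

```
# Block all crawlers from the private area, allow everything else
User-agent: *
Disallow: /private/

# Block one specific crawler entirely (example bot name)
User-agent: ExampleBot
Disallow: /
```

Each `User-agent` line starts a new group of rules, and the `Disallow` lines beneath it list the path prefixes that group of crawlers should not fetch.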
User-agent: [agent-name]
Disallow: [URL path not to be crawled]
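To see how these directives are interpreted, you can use Python's standard-library `urllib.robotparser`. This sketch parses an inline set of rules (the `/private/` path and example.com URLs are illustrative assumptions) and checks whether a given URL may be fetched:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content: block /private/ for all crawlers
rules = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# A disallowed path is reported as not fetchable; other paths are allowed
print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("*", "https://example.com/public/page.html"))   # True
```

In practice a crawler would call `rp.set_url("https://example.com/robots.txt")` followed by `rp.read()` to load the live file instead of parsing an inline string.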