12-04-2017, 11:17 AM
Robots.txt: It is a text file created by webmasters that instructs web robots (crawlers) how to crawl pages on their website.
The main purpose of this file is to indicate which pages of the website should or should not be crawled. The crawl instructions are given with "Allow" and "Disallow" directives, which are specified per user agent.
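As a sketch, a minimal robots.txt might look like the following (the domain and paths are placeholders, not from the original post):

```
# Apply to all crawlers: block the /admin/ directory, allow the rest
User-agent: *
Disallow: /admin/
Allow: /

# Give one specific crawler its own rule
User-agent: Googlebot
Disallow: /private/

# Optionally point crawlers to the sitemap
Sitemap: https://www.example.com/sitemap.xml
```

The file must be placed at the root of the site (e.g. https://www.example.com/robots.txt) for crawlers to find it.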