03-08-2018, 04:30 AM
Robots.txt is a text file created by webmasters that instructs web robots (crawlers) how to crawl pages on their website. The file indicates which parts of the site may or may not be crawled, using "Allow" and "Disallow" directives that are scoped to specific user agents.
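As an illustrative sketch, a minimal robots.txt placed at the site root might look like this (the paths and the second user agent are hypothetical examples):

```
# Rules for all crawlers
User-agent: *
Disallow: /private/
Allow: /public/

# Rules for one specific crawler
User-agent: Googlebot
Disallow: /search-results/
```

Crawlers that honor the protocol read this file before fetching other pages and skip any path matched by a Disallow rule for their user agent.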