09-07-2018, 06:43 AM
Robots.txt is a basic text file which works when uploaded to the root directory of a website. It's the standard way to tell all or specific search engine crawlers not to crawl certain pages, directories, or other resources on the website. If a page or directory is listed after the Disallow directive, compliant crawlers for the matched user-agent will not crawl it. Note that robots.txt only controls crawling, not indexing — a blocked URL can still appear in search results if other sites link to it.
Ex- (blocks all crawlers from the entire site):
User-agent: *
Disallow: /
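Ex- (to block only a specific directory — here a hypothetical /private/ folder used for illustration — while leaving the rest of the site crawlable):
User-agent: *
Disallow: /private/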