03-25-2019, 08:10 AM
Robots.txt is a text file that webmasters create to instruct web robots how to crawl pages on their website. It is part of the robots exclusion protocol (REP), a set of standards that regulates how robots crawl the web, access and index content, and serve that content to end users.
" Robots.txt is a text file you put on your site to tell search robots which pages you would like them not to visit."
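To see how those rules play out in practice, here is a minimal sketch using Python's standard-library `urllib.robotparser`. The `User-agent`, `Disallow`, and `Allow` lines and the example.com URLs are just illustrative; real crawlers would fetch the live file from `https://yoursite.com/robots.txt` instead.

```python
from urllib import robotparser

# A sample robots.txt: block everything under /private/ for all bots,
# allow the rest of the site.
rules = """
User-agent: *
Disallow: /private/
Allow: /
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Well-behaved crawlers check can_fetch() before requesting a URL.
print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("*", "https://example.com/index.html"))         # True
```

Note that robots.txt only advises crawlers; it does not enforce anything, so it should never be used to hide sensitive content.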