SEO MotionZ Forum
Why We Use Robots.txt File In SEO? - Printable Version

+- SEO MotionZ Forum (https://seomotionz.com)
+-- Forum: Search Engine Optimization (https://seomotionz.com/forumdisplay.php?fid=7)
+--- Forum: SEO General (https://seomotionz.com/forumdisplay.php?fid=22)
+--- Thread: Why We Use Robots.txt File In SEO? (/showthread.php?tid=5120)



Why We Use Robots.txt File In SEO? - pihu147741 - 03-25-2019

Hello friends,

Why do we use a robots.txt file in SEO?


RE: Why We Use Robots.txt File In SEO? - swatijain2233 - 03-25-2019

Why does the robots.txt file matter? The robots.txt file, which implements the robots exclusion protocol (also called the robots exclusion standard), is a text file that tells web robots (most often search engine crawlers) which pages on your site they may crawl.
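To make that concrete, here is a minimal robots.txt sketch. The directory name /private/ and the sitemap URL are placeholders, not paths from any real site:

User-agent: *                              # these rules apply to every crawler
Disallow: /private/                        # ask crawlers to skip everything under /private/
Sitemap: https://example.com/sitemap.xml   # optional: point crawlers at your sitemap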


RE: Why We Use Robots.txt File In SEO? - websiteee - 03-25-2019

Robots.txt is a text file that webmasters create to instruct web robots how to crawl pages on their website. It is part of the robots exclusion protocol (REP), a group of web standards that regulate how robots crawl the web, access and index content, and serve that content to users.
"Robots.txt is a text file you put on your site to tell search robots which pages you would like them not to visit."


RE: Why We Use Robots.txt File In SEO? - jonathan brown - 03-25-2019

Hi,
 
Robots.txt tells crawlers such as Googlebot which pages of your site they may crawl and which they should skip, before those pages are indexed.

For example, if you specify in your robots.txt file that you don't want the search engines to be able to access your thank-you page, that page generally won't show up in the search results and web users won't be able to find it.
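A sketch of that thank-you-page example (the path /thank-you/ is a placeholder for whatever URL your page actually uses):

User-agent: *          # applies to every crawler
Disallow: /thank-you/  # ask crawlers not to fetch the thank-you page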

Search engines send out small programs called "spiders" or "robots" to crawl your site and bring information back so that the pages of your site can be indexed in the search results and found by web users. Your robots.txt file instructs these programs not to crawl pages on your site that you designate using a "disallow" command.

For example, the following robots.txt rule:
User-agent: *
Disallow: /images
would block all search engine robots from visiting any URL on your website whose path begins with /images.
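One refinement worth knowing: major crawlers such as Googlebot also honor an Allow directive, so you can re-open one file inside a blocked directory. A sketch, where logo.png is just a placeholder file name:

User-agent: *
Disallow: /images          # block the /images path prefix
Allow: /images/logo.png    # but let crawlers fetch this one file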