SEO MotionZ Forum

Full Version: Why Do We Use the Robots.txt File in SEO?
Hello friends,

Why do we use the robots.txt file in SEO?
First, let's look at why the robots.txt file matters in the first place. The robots.txt file, which implements the robots exclusion protocol (or standard), is a plain text file that tells web robots (most often search engine crawlers) which pages on your site they may crawl.
Robots.txt is a text file that webmasters create to instruct web robots how to crawl pages on their website. It is part of the robots exclusion protocol (REP), a set of conventions that regulate how robots crawl the web, access and index page content, and serve that content to end users.
"Robots.txt is a text file you put on your site to tell search robots which pages you would like them not to visit."
Hi,
 
Robots.txt tells the Google crawler which pages of your site it may crawl and which it should skip, before those pages are indexed.

For example, if you specify in your robots.txt file that you don't want search engines to crawl your thank-you page, that page generally won't show up in the search results, so web users won't find it there. (Strictly speaking, robots.txt blocks crawling, not indexing: a disallowed URL can still appear in results if other sites link to it.)

Search engines send out small programs called "spiders" or "robots" to crawl your site and bring information back to the search engines, so that the pages of your site can be indexed in the search results and found by web users. Your robots.txt file instructs these programs not to crawl pages on your site which you designate using a "disallow" command.
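You can see this check in action with Python's standard-library robots.txt parser. A minimal sketch, assuming a hypothetical site with a disallowed /thank-you page (the domain and path are placeholders, not from any real site):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules: block every robot from the thank-you page.
rules = """User-agent: *
Disallow: /thank-you
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)  # on a live site you would call set_url(...) and read()

# A well-behaved crawler performs exactly this check before fetching a URL.
print(parser.can_fetch("*", "https://example.com/thank-you"))  # False
print(parser.can_fetch("*", "https://example.com/blog"))       # True
```

Keep in mind that robots.txt is purely advisory: polite crawlers run this kind of check before fetching a page, but nothing technically forces a robot to obey it.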

For example, the following Robots.txt command:
User-agent: *
Disallow: /images 
would block all search engine robots from crawling any URL on your website whose path starts with /images.
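For context, a real robots.txt usually combines several such rules. Here is a sketch of a fuller file; all paths and the sitemap URL are illustrative, not taken from any particular site:

```
# Rules for all robots
User-agent: *
Disallow: /images
Disallow: /thank-you

# Rules for one specific robot
User-agent: Googlebot
Disallow: /private/

# Widely supported extension: tell crawlers where your sitemap lives
Sitemap: https://www.example.com/sitemap.xml
```

The file must be served as plain text at the root of the site (e.g. https://www.example.com/robots.txt), since that is the only place crawlers look for it.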