SEO MotionZ Forum

Full Version: Robots code
Why is robots.txt code used in sitemap creation?
The robots.txt file tells search engine bots which parts of your site to crawl and index (or not). It is not an antivirus; it simply stops search spiders from indexing any part of your site that you don't want made public in the search engines.
The robots.txt file is also one of the most important elements of on-page search engine optimization. It tells search engines which pages you do not want crawled or indexed.
We need to know the basic robots.txt directives:

User-agent: * — this line says that the rules which follow apply to all crawlers, including Google's.
Disallow: / — this line tells those crawlers not to crawl the site at all; to keep only a particular private page out of search results, put that page's path after Disallow: instead, as shown below.
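For example, a minimal robots.txt could look like this (the /private/ directory is just a placeholder path, not something from this thread):

User-agent: *
Disallow: /private/

Here every crawler may index the site except the /private/ directory; changing the Disallow line back to a bare / would block the whole site instead.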
You can also use robots.txt together with sitemaps to get your web pages indexed: a Sitemap line in robots.txt tells search engines (or other robots) where your sitemap is located. It is usually easier to build sitemaps with a sitemap generation tool rather than trying to hand-code them.
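As a sketch, the sitemap declaration is a single line; the URL below is only a placeholder domain:

Sitemap: https://www.example.com/sitemap.xml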
robots.txt is a text file that tells search engines which pages, directories, or domains they may or may not index. I upload it to the root directory of the domain or subdomain I am promoting, and I also submit and test it in Google Webmaster Tools.
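Assuming the hypothetical domain www.example.com, crawlers would then look for the file at:

https://www.example.com/robots.txt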