Joined: Dec 2014
Posts: 11
Likes Received: 0
Why is robots.txt code used in sitemap creation?
Joined: Sep 2013
Posts: 998
Likes Received: 56
The robots.txt file is used to tell search engine bots which parts of your site to crawl and index (or not). It is not an antivirus measure; it simply asks search spiders to stay out of any part of your site that you do not want made public in search results.
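For example, a minimal robots.txt sketch might look like this (the /private/ path is purely illustrative; substitute whatever section of your site you want kept out of search results):

    User-agent: *
    Disallow: /private/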
Joined: Mar 2016
Posts: 68
Likes Received: 0
The robots.txt file is one of the most important elements of on-page search engine optimization. It is used to tell search engines which pages you do not want crawled or indexed.
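Rules can also be scoped to a single crawler instead of all of them. A rough sketch (Googlebot is Google's crawler; the /drafts/ path is a made-up example):

    User-agent: Googlebot
    Disallow: /drafts/

    User-agent: *
    Disallow:

Here Googlebot is asked to skip /drafts/ while every other crawler may fetch everything.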
Joined: Mar 2016
Posts: 10
Likes Received: 0
We need to know the basic robots.txt directives.
User-agent: * means the rules that follow apply to every crawler, including Google's.
Disallow: / tells those crawlers to stay out of the entire site; to keep only a private page out of search results, disallow that specific path instead of the whole site.
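To make the difference concrete, here is a sketch of the two extremes, written as two separate files (text after # is a comment and is ignored by crawlers):

    # File 1: allow all crawlers to access everything
    User-agent: *
    Disallow:

    # File 2: block all crawlers from the entire site
    User-agent: *
    Disallow: /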
Joined: Nov 2013
Posts: 557
Likes Received: 0
You can use robots.txt together with a sitemap to get your web pages indexed by search engines. The robots.txt file can tell search engines and other robots where your sitemap is located. It is usually better to build sitemaps with a sitemap generation tool rather than trying to hand-code them.
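As a sketch, the sitemap location is declared with a Sitemap line in robots.txt (the URL is a placeholder; point it at your own generated sitemap file):

    User-agent: *
    Disallow:

    Sitemap: https://www.example.com/sitemap.xml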
Joined: Jun 2016
Posts: 41
Likes Received: 0
robots.txt is a text file that tells search engines which pages, directories, or domains they should or should not index. I place it in the root directory of the domain or subdomain I am promoting and submit it in Google Webmaster Tools.
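One caveat worth noting: crawlers only look for the file at the root of each host, so each subdomain needs its own copy. Roughly, with example.com as a placeholder:

    https://example.com/robots.txt         applies to example.com only
    https://blog.example.com/robots.txt    applies to the blog subdomain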