Robots code
#1
Why is a robots.txt file used alongside sitemap creation?
#2
A robots.txt file tells search engine bots which parts of your site to crawl and index, and which parts to skip. It is not a security or antivirus measure; it simply asks search spiders to stay out of any section of your site that you do not want made public in the search engines.
#3
The robots.txt file is one of the most important elements of on-page search engine optimization. It tells search engines which pages you do not want crawled or indexed.
#4
We need to know the basic robots.txt directives:

User-agent: * — this line addresses all crawlers (Google's bots included); the rules that follow it apply to every bot.
Disallow: / — this line asks crawlers not to crawl the specified path. It is used for private pages that you do not want to appear live in search results.
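As a minimal sketch, a robots.txt file combining these directives might look like the following (it must be placed at the site root, e.g. example.com/robots.txt; the paths and domain shown here are hypothetical):

```
# Rules that apply to all crawlers
User-agent: *
# Ask bots not to crawl this private directory
Disallow: /private/
# Everything not disallowed is crawlable by default

# Optionally tell crawlers where the sitemap lives
Sitemap: https://www.example.com/sitemap.xml
```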
#5
You can use robots.txt and sitemaps together to get your web pages indexed: a Sitemap: line in robots.txt tells search engines and other robots where your sitemap is located. It is usually better to build sitemaps with a sitemap generation tool rather than trying to hand-code them.
#6
robots.txt is a plain text file that tells search engines which pages, directories, or domains should or should not be indexed. I place it in the root directory of the domain or subdomain I am promoting, and I also submit and test it in Google Webmaster Tools.
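If you want to check how crawlers will interpret your rules before uploading the file, Python's standard urllib.robotparser module can parse robots.txt rules and answer can_fetch queries. This is a small illustrative sketch; the rules, domain, and paths are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content: block /private/ for all bots
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
# parse() accepts the file's lines directly, so no network fetch is needed
rp.parse(robots_txt.splitlines())

# can_fetch(useragent, url) reports whether a crawler may fetch a URL
print(rp.can_fetch("*", "https://www.example.com/index.html"))      # True
print(rp.can_fetch("*", "https://www.example.com/private/a.html"))  # False
```

In production you would call rp.set_url("https://yourdomain.com/robots.txt") followed by rp.read() to fetch and parse the live file instead.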

