Why We Use Robot.txt File In Seo..?
#4
Hi,
 
Robots.txt tells the Google crawler (and other search engine bots) which pages of your site it may crawl and which it should skip, so the file is read before anything gets indexed.

For example, if you specify in your robots.txt file that you don't want search engines to crawl your thank-you page, crawlers will skip it, so in most cases that page won't show up in the search results and web users won't find it through search.
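
As a sketch, the entry for that case could look like the lines below (the /thank-you/ path is just an illustrative placeholder; use whatever URL your site actually serves):
User-agent: *
Disallow: /thank-you/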

Search engines send out small programs called "spiders" or "robots" to crawl your site and bring information back so that your pages can be indexed in the search results and found by web users. Your robots.txt file instructs these programs not to crawl the pages on your site that you designate with a "Disallow" directive.

For example, the following Robots.txt command:
User-agent: *
Disallow: /images 
would block all search engine robots from crawling the /images directory (and everything under it) on your website.
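
If you want to verify what a given robots.txt actually blocks, you can test it programmatically. The short Python sketch below uses the standard library's urllib.robotparser; the example.com URLs are placeholders for your own domain, and it assumes the rule above is live at /robots.txt.

from urllib.robotparser import RobotFileParser

# Point the parser at the site's robots.txt and fetch it (this performs a network request)
rp = RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()

# With "Disallow: /images" in place, anything under /images is reported as blocked
print(rp.can_fetch("*", "https://www.example.com/images/logo.png"))  # False
print(rp.can_fetch("*", "https://www.example.com/index.html"))       # True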


RE: Why We Use Robot.txt File In Seo..? - by jonathan brown - 03-25-2019, 08:28 AM