SEO MotionZ Forum

Full Version: Is robots.txt file enough to block some pages from Search Engines?
Is the robots.txt file enough to block some pages from search engines? I mean from crawling and indexing.
No, it's not enough. It's best to block them through the .htaccess file.
(10-24-2019, 05:52 AM)lammao Wrote: Is the robots.txt file enough to block some pages from search engines? I mean from crawling and indexing.

The robots.txt file only tells search engines (or "suggests," some might say) that they can or cannot crawl and index a page. It doesn't actually block them. The .htaccess file is the best option for you, as suggested.
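To illustrate why robots.txt is only a suggestion: it's a plain-text file of directives that well-behaved crawlers choose to honor. A minimal sketch, assuming you want to ask all crawlers to stay out of a directory (the /private/ path is just an example):

```
# robots.txt - placed at the site root; advisory only, nothing is enforced
User-agent: *
Disallow: /private/
```

A crawler that ignores this file can still fetch every page in /private/, which is why server-side blocking is the stronger option.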
Can anyone provide me the code for htaccess file?
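A hedged sketch of what that could look like in .htaccess on an Apache server with mod_rewrite enabled; the bot names and the /private/ path are examples only, not a complete list of crawlers:

```apache
# Return 403 Forbidden to common search engine bots requesting /private/
# (requires mod_rewrite; extend the User-Agent list as needed)
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (Googlebot|Bingbot|DuckDuckBot) [NC]
RewriteRule ^private/ - [F]
```

Unlike robots.txt, this is enforced by the server, so a matching bot cannot fetch the content at all. Note that it matches on the User-Agent header, which a misbehaving crawler can spoof.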
I hope the article helped you.
A page that is disallowed in robots.txt can still be indexed if it is linked to from other websites. While Google won't crawl or index the content blocked by a robots.txt file, it might still find and index a disallowed URL if it is linked from other places on the web.
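If the goal is to keep a page out of the index even when other sites link to it, the usual fix is a noindex signal rather than a robots.txt disallow; the page must remain crawlable so Google can actually see that signal. A minimal sketch for Apache, assuming mod_headers is enabled (the .pdf pattern is just an example):

```apache
# Send a noindex header for matching files (requires mod_headers)
# Do NOT also disallow these URLs in robots.txt, or crawlers never see the header
<FilesMatch "\.pdf$">
    Header set X-Robots-Tag "noindex, nofollow"
</FilesMatch>
```

For HTML pages, the equivalent is a `<meta name="robots" content="noindex">` tag in the page head.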