Why We Use Robot.txt File In Seo..?
Hi,
 
The robots.txt file tells search engine crawlers, such as Google's crawler, which pages of your site they may crawl and which they should skip, before any indexing takes place.

For example, if you specify in your robots.txt file that you don't want search engines to access your thank-you page, they won't crawl it and it generally won't show up in the search results. (Strictly speaking, a disallowed URL can still appear in results as a bare link if other sites link to it; a noindex tag is the more reliable way to keep a page out of search results entirely.)
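As a sketch of that case, the directive would look like this (the /thank-you path here is a hypothetical example, not a standard path):

```
User-agent: *
Disallow: /thank-you
```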

Search engines send out tiny programs called "spiders" or "robots" to crawl your site and bring information back so that your pages can be indexed in the search results and found by web users. Your robots.txt file instructs these programs not to crawl the pages on your site that you designate with a Disallow directive.

For example, the following robots.txt rules:
User-agent: *
Disallow: /images
would block all search engine robots from crawling any URL under the /images directory of your website.
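One quick way to check how such rules behave is Python's standard-library urllib.robotparser, which can evaluate a robots.txt against specific URLs. This is just an illustrative sketch; the example.com URLs are placeholders:

```python
from urllib.robotparser import RobotFileParser

# The same rules as the example above (example.com is a placeholder domain).
rules = [
    "User-agent: *",
    "Disallow: /images",
]

rp = RobotFileParser()
rp.parse(rules)

# Any URL whose path starts with /images is disallowed for every crawler.
print(rp.can_fetch("*", "https://example.com/images/logo.png"))  # False
print(rp.can_fetch("*", "https://example.com/about"))            # True
```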


Messages In This Thread
RE: Why We Use Robot.txt File In Seo..? - by jonathan brown - 03-25-2019, 08:28 AM
