Joined: Feb 2019
Posts: 14
Likes Received: 3
Hello friends,
Why do we use the robots.txt file in SEO?
Joined: Apr 2015
Posts: 9
Likes Received: 0
First, let's take a look at why the robots.txt file matters in the first place. The robots.txt file, also known as the robots exclusion protocol or standard, is a text file that tells web robots (most often search engine crawlers) which pages on your site to crawl.
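For illustration, a minimal robots.txt (the file always lives at the root of the domain, e.g. www.example.com/robots.txt; the paths below are only hypothetical examples) could look like this:

User-agent: *
Disallow: /admin/
Disallow: /cart/
Sitemap: https://www.example.com/sitemap.xml

Here the * record applies to all robots, the Disallow lines list paths they should not crawl, and the Sitemap line points crawlers to your XML sitemap.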
Joined: Jul 2017
Posts: 134
Likes Received: 5
Robots.txt is a text file that webmasters create to instruct web robots how to crawl pages on their website. It is part of the robots exclusion protocol (REP), a group of standards that regulates how robots crawl the web, access and index page content, and serve that content to end users.
" Robots.txt is a text file you put on your site to tell search robots which pages you would like them not to visit."
Joined: Nov 2018
Posts: 98
Likes Received: 2
03-25-2019, 08:28 AM
Hi,
Robots.txt tells crawlers such as Googlebot which pages of your site they may crawl and which they should not, before those pages are ever indexed.
For example, if you specify in your robots.txt file that you don’t want search engines to access your thank-you page, that page won’t show up in the search results and web users won’t find it through search.
Search engines send out small programs called “spiders” or “robots” to crawl your site and bring information back to the search engines so that the pages of your site can be indexed in the search results and found by web users. Your robots.txt file instructs these programs not to crawl the pages on your site that you designate with a “Disallow” directive.
For example, the following Robots.txt command:
User-agent: *
Disallow: /images
would block all search engine robots from crawling any URL on your site whose path begins with /images (such as a hypothetical www.yoursite.com/images/logo.png), since Disallow rules are matched as path prefixes.
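If you later wanted to let one particular robot back into that directory, the robots exclusion protocol lets you add a more specific group for it, for example:

User-agent: Googlebot-Image
Allow: /images

A robot follows the group whose User-agent line most specifically matches it, so in this sketch Googlebot-Image would use this record instead of the * record, while every other compliant bot would stay blocked.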