How de-index low quality pages from Google?
Robots.txt is a plain text file that works when uploaded to the root directory of a website. It's the standard way to tell all or specific search engines not to crawl certain pages, directories, or other resources on the site. Any path listed after a Disallow directive will not be crawled by the matching crawler. One caveat: Disallow only blocks crawling, it does not guarantee de-indexing. A page that is already indexed, or that other sites link to, can still appear in Google's results. To actually de-index low quality pages, use a noindex meta tag or X-Robots-Tag header, and make sure the page is NOT blocked in robots.txt, because Googlebot has to crawl the page to see the noindex signal.

Example (this blocks all crawlers from the entire site):

User-agent: *
Disallow: /
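
Since Disallow only stops crawling, the pages you actually want removed from Google's index need a noindex signal instead. A minimal sketch (the page must remain crawlable so Googlebot can see the tag):

In the HTML head of the low quality page:

<meta name="robots" content="noindex">

Or, for non-HTML resources, send it as an HTTP response header:

X-Robots-Tag: noindex

Once Google recrawls the page and sees the noindex, it drops the page from the index; after that you can optionally add a Disallow rule to save crawl budget.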