Robots.txt Generator


Default - All Robots are:
Crawl-Delay:
Sitemap: (leave blank if you don't have one)
Search Robots:
  Google
  Google Image
  Google Mobile
  MSN Search
  Yahoo
  Yahoo MM
  Yahoo Blogs
  Ask/Teoma
  GigaBlast
  DMOZ Checker
  Nutch
  Alexa/Wayback
  Baidu
  Naver
  MSN PicSearch
   
Restricted Directories: The path is relative to the root and must contain a trailing slash "/".

Now, create a 'robots.txt' file in the root directory of your website, copy the text generated above, and paste it into that file.
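
For reference, a minimal sketch of what a generated file might look like; the crawl delay value, directory name, and sitemap URL below are placeholders, not output from the tool:

    # Apply these rules to all robots
    User-agent: *
    # Wait 10 seconds between requests (some crawlers, e.g. Google, ignore this)
    Crawl-delay: 10
    # Restricted directory, relative to the root, with a trailing slash
    Disallow: /cgi-bin/

    # Location of the XML sitemap, if you have one
    Sitemap: https://www.example.com/sitemap.xml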


About Robots.txt Generator

FREE Robots.txt Generator

Robots.txt is an instruction file that tells crawlers how a website should be crawled. Websites use this standard, also known as the robots exclusion protocol, to tell search engines which parts of the site should be indexed and which areas should be skipped, for example sections that contain duplicate content or are still under construction. Be aware that malicious bots such as malware robots and email harvesters are unlikely to follow the standard: they scan your site for security weaknesses and may well begin examining it from exactly the places you don't want indexed.

The file is built around the User-agent directive, beneath which other directives such as "Allow," "Disallow," and "Crawl-Delay" are written. Writing it by hand takes time, because a single file can hold many lines of directives: every page you want bots to ignore needs its own Disallow line, and Allow works the same way for exceptions you want crawled. One wrong line in your robots.txt file can exclude a page from indexation, so don't assume the job is trivial; it is safer to leave file creation to our Robots.txt generator and not worry about it.
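
To see how these directives fit together, here is a rough hand-written sketch; the directory and file names below are placeholders:

    # Rules for every crawler
    User-agent: *
    # Hide a duplicate-content archive from crawling
    Disallow: /archive/
    # ...but still allow one page inside it to be crawled
    Allow: /archive/summary.html

If the Disallow line were mistakenly written as "Disallow: /", every page on the site would be blocked, which is exactly the kind of error the generator helps you avoid.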