Robots.txt Generator

The generator provides the following options:

- Default - All Robots are: (Allowed or Refused)
- Crawl-Delay
- Sitemap: (leave blank if you don't have one)
- Search Robots: Google, Google Image, Google Mobile, MSN Search, Yahoo, Yahoo MM, Yahoo Blogs, Ask/Teoma, GigaBlast, DMOZ Checker, Nutch, Alexa/Wayback, Baidu, Naver, MSN PicSearch
- Restricted Directories: the path is relative to root and must contain a trailing slash "/"



Now, create a 'robots.txt' file in the root directory of your site. Copy the generated text above and paste it into that file.
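For reference, a generated file might look like the following. The sitemap URL and the restricted directory are placeholders, not output from this specific tool:

```
User-agent: *
Crawl-delay: 10
Disallow: /cgi-bin/
Sitemap: https://example.com/sitemap.xml
```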


About Robots.txt Generator

A robots.txt file contains instructions that tell search engine crawlers which parts of your website they may crawl. It is also known as the robots exclusion protocol. The generator offers a couple of options and makes the file easy to create, and the file can be modified at any time later. The main directives used in this file are:

1. Crawl-delay - When a crawler sends too many requests in a short time, the server can become overloaded. This directive tells crawlers to wait between requests so they do not overwhelm the host.
2. Allow/Disallow - Any number of URLs can be listed under these directives. Use Disallow for pages on your site that you do not want search engines to index.
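The effect of these directives can be checked with Python's standard-library `urllib.robotparser` module. This is a minimal sketch using a hypothetical robots.txt with the two directives described above; the URLs are placeholders:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration.
rules = """User-agent: *
Crawl-delay: 10
Disallow: /private/
Allow: /""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# A well-behaved crawler must skip disallowed paths...
print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False
# ...may fetch everything else...
print(rp.can_fetch("*", "https://example.com/index.html"))         # True
# ...and should wait this many seconds between requests.
print(rp.crawl_delay("*"))                                         # 10
```

This is also a quick way to verify a generated file before deploying it.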