Robots.txt Generator


Default - All Robots are:
Crawl-Delay:
Sitemap: (leave blank if you don't have one)
Search Robots: Google, Google Image, Google Mobile, MSN Search, Yahoo, Yahoo MM, Yahoo Blogs, Ask/Teoma, GigaBlast, DMOZ Checker, Nutch, Alexa/Wayback, Baidu, Naver, MSN PicSearch
Restricted Directories: The path is relative to the root and must contain a trailing slash "/".



Now create a 'robots.txt' file in your site's root directory, copy the generated text above, and paste it into that file.
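For instance, a generated file that blocks an admin area, sets a crawl delay, and declares a sitemap might look like the sketch below (the paths and sitemap URL are placeholders, not output from this tool):

```
User-agent: *
Disallow: /admin/
Crawl-delay: 10
Sitemap: https://www.example.com/sitemap.xml
```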


About Robots.txt Generator

Robots.txt is a file placed in the root folder of your website to help search engines index your site more accurately. Search engines such as Google use website crawlers, or robots, that review all the content on your website.

There may be parts of your website that you don't want crawlers to explore or include in users' search results, such as the admin page. You can add these pages to the file so they are explicitly ignored. The robots.txt file uses the Robots Exclusion Protocol.

With the robots.txt generator, you can easily create a new robots.txt file or edit an existing one for your site. Use the tool to create Allow or Disallow directives (Allow is the default; click to change) for specific user agents and the content you specify on your site.
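To see how per-agent Allow and Disallow directives combine, you can test a rule set with Python's standard-library robots.txt parser. This is a minimal sketch; the domain, user agents, and paths are made-up examples, not values produced by the generator:

```python
from urllib import robotparser

# A hypothetical robots.txt: every crawler is blocked from /admin/,
# but Googlebot is explicitly allowed into /admin/public/.
rules = """\
User-agent: *
Disallow: /admin/

User-agent: Googlebot
Allow: /admin/public/
Disallow: /admin/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Check which URLs each crawler may fetch.
print(rp.can_fetch("*", "https://example.com/admin/"))                       # False
print(rp.can_fetch("Googlebot", "https://example.com/admin/public/a.html"))  # True
print(rp.can_fetch("Googlebot", "https://example.com/admin/secret.html"))    # False
```

Because the more specific Allow line appears in Googlebot's own record, it takes effect for that agent, while every other crawler falls back to the generic Disallow rule.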