Robots.txt Generator


Default - All Robots are:
Crawl-Delay:
Sitemap: (leave blank if you don't have one)
Search Robots: Google, Google Image, Google Mobile, MSN Search, Yahoo, Yahoo MM, Yahoo Blogs, Ask/Teoma, GigaBlast, DMOZ Checker, Nutch, Alexa/Wayback, Baidu, Naver, MSN PicSearch
Restricted Directories: (the path is relative to root and must contain a trailing slash "/")
Once the directives are generated, create a 'robots.txt' file in your site's root directory, then copy the generated text and paste it into that file.
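As a rough sketch of what the generated file can look like (the domain, path, and delay value below are illustrative, not output from the tool), a robots.txt combining the fields above might read:

```
# Apply to all crawlers by default
User-agent: *
# Block one restricted directory (hypothetical path)
Disallow: /cgi-bin/
# Ask crawlers to wait 10 seconds between requests
Crawl-delay: 10

# Optional sitemap location (example URL)
Sitemap: https://www.example.com/sitemap.xml
```

Note that `Crawl-delay` is honored by some crawlers (e.g. Bing, Yandex) but ignored by others, so treat it as a hint rather than a guarantee.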


About Robots.txt Generator

Robots.txt Generator at “Pool of Tools”

To create proper robots.txt directives for your site, you can use the Robots.txt Generator. It lets you easily import an existing robots.txt file from another site or build your own from scratch. "Pool of Tools" provides quality SEO tools, all free of charge.

Why is the Robots.txt Generator beneficial?

  • It can load an existing file. Just type or paste the root domain’s URL into the provided text box and click ‘Upload’.
  • It lets you customize the generated robots.txt file using the ‘Allow’ and ‘Disallow’ functions. The default setting is ‘Allow’.
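For instance, switching one path from the default ‘Allow’ to ‘Disallow’ adds a blocking rule for that path. In a sketch (the /private/ path is hypothetical), the output would contain:

```
User-agent: *
# This path was set to 'Disallow' in the generator
Disallow: /private/
```

Paths left on the default ‘Allow’ setting need no explicit line, since crawlers may fetch anything that is not disallowed.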

What does the Robots.txt Generator do?

  • It can specify alternate directives for a specific search engine crawler by naming that bot.
  • To add a custom section to the list, carrying the generic directive into the new custom directive, select ‘Add Directive’.
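As a sketch of what a per-crawler custom directive produces (the bot name and paths below are illustrative), the generated file would gain a section addressed to that specific crawler alongside the generic one:

```
# Generic directive for all crawlers
User-agent: *
Disallow: /admin/

# Custom directive for one named crawler
User-agent: Googlebot-Image
Disallow: /photos/
```

A crawler matching a specific `User-agent` group follows that group's rules instead of the generic `*` group, which is why naming the bot lets you give it alternate directives.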