Robots.txt Generator


About Robots.txt Generator

What is Robots.txt Generator?

The Wiztools robots.txt generator is a free tool that helps webmasters make their websites Googlebot friendly by generating a robots.txt file.
A robots.txt file is not the same as a sitemap: a sitemap lists the pages you want search engines to index, while robots.txt tells crawlers which pages and directories they should not access, so getting its syntax right matters for any website. Whenever a search engine bot crawls a website, it first looks for a robots.txt file, which is normally located in the root directory. If one is found, the crawler reads it to learn which files and directories are blocked.
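As an illustration, a generated robots.txt might look like the following. The exact directives depend on your site; the paths and sitemap URL here are hypothetical examples, not output of the tool:

```
User-agent: *
Crawl-delay: 10
Disallow: /admin/
Disallow: /private/

Sitemap: https://example.com/sitemap.xml
```

Here `User-agent: *` applies the rules to all crawlers, each `Disallow` line blocks one directory (note the trailing slash), and the optional `Sitemap` line points crawlers to the sitemap.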

Why should one use our robots.txt File Generator tool?

The first reason is that this tool makes webmasters' lives easier by helping them make their websites Googlebot friendly. It performs what would otherwise be a tedious manual task, generating the required file in no time and free of cost. The tool also has a user-friendly design that gives you options to include or exclude items from your robots.txt file.

How to use our robots.txt Generator tool?

To use this free online tool, follow these simple steps to generate your first robots.txt file:

  1. By default, all search engine robots are allowed access to your website's files. You can choose which robots to block and which to allow.
  2. You can set a crawl delay, which instructs crawlers to pause for the specified time between requests. A delay between 5 and 120 seconds is commonly recommended; by default this option is set to "no delay".
  3. If you already have a sitemap for your website, paste its link in the text box. If not, leave it blank and proceed to the next step.
  4. From the list of available search engine robots, select the ones you want to allow to crawl your site and the ones you want to refuse.
  5. The final step is to restrict files and directories. The directory path is relative to the root and must end with a trailing slash "/".
  6. When everything is done, click the generate button. It will generate your robots.txt file, which you can then upload to your website's root directory.
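Once generated, you can sanity-check the file before uploading it. A minimal sketch using Python's standard-library `urllib.robotparser` (the directives and URLs below are hypothetical examples, not output of the tool):

```python
from urllib.robotparser import RobotFileParser

# A hypothetical generated robots.txt, as a string.
robots_txt = """\
User-agent: *
Crawl-delay: 10
Disallow: /admin/
Disallow: /private/

Sitemap: https://example.com/sitemap.xml
"""

# Parse the file line by line instead of fetching it over HTTP.
rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Check which URLs a generic crawler may fetch.
print(rp.can_fetch("*", "https://example.com/admin/login"))  # False
print(rp.can_fetch("*", "https://example.com/index.html"))   # True
print(rp.crawl_delay("*"))                                   # 10
```

If a URL you expect to be blocked comes back as fetchable, check the `Disallow` paths for a missing leading or trailing slash before uploading the file.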
