Best Robots.txt Generator

Free Web Worth SEO Toolkit

Robots.txt Generator

Default - All Robots are:
Sitemap: (leave blank if you don't have one)
Search Robots: Google, Google Image, Google Mobile, MSN Search, Yahoo MM, Yahoo Blogs, DMOZ Checker, MSN PicSearch
Restricted Directories: the path is relative to the root and must contain a trailing slash "/"
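As an illustration, a file generated with all robots allowed, a sitemap set, and an /admin/ directory restricted might look like the following (the domain and paths are placeholders, not output from the tool itself):

```
User-agent: *
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml
```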

Now, create a file named robots.txt in your site's root directory, copy the text generated above, and paste it into that file.
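The step above can be done from a terminal as well. A minimal sketch, assuming a Unix-like shell; the rules and sitemap URL are placeholders to replace with your generated text, and the file must then be uploaded to your site's root:

```shell
# Create robots.txt in the current directory; upload it to the
# site root afterwards. The rules below are placeholder examples.
cat > robots.txt <<'EOF'
User-agent: *
Disallow: /admin/
Sitemap: https://www.example.com/sitemap.xml
EOF
```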

About Robots.txt Generator

The Robots.txt Generator helps you create a robots.txt file for your site. This file tells search engines which parts of your site you want them to crawl and index. The generator helps ensure your site stays search engine friendly.

Bots are crawling the web at an ever-increasing pace. However, not all bots are created equal: some crawl the web fulfilling a useful mission, while others crawl it causing damage. This article looks at how you can protect your site from bots that are out to cause harm.


Make a file called robots.txt.

A robots.txt file may be created using nearly any text editor. Notepad, TextEdit, vi, and emacs, for example, can all produce valid robots.txt files. Avoid word processors: they store files in proprietary formats and can add unexpected characters such as curly quotes, which might cause crawlers difficulties. If prompted in the save dialog, save the file using UTF-8 encoding.

Rules for format and location:

The file's name must be robots.txt.
There can only be one robots.txt file on your website.
The robots.txt file must be placed in the root directory of the website to which it applies. For example, to govern crawling on all URLs below https://www.example.com/, the robots.txt file must be hosted at https://www.example.com/robots.txt. It can't be stored in a subfolder (such as https://www.example.com/pages/robots.txt). Contact your web hosting provider if you're unsure how to access your website's root directory or if you need permissions to do so. If you can't reach the root of your website, use a different blocking method, such as meta tags.
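The meta tag alternative mentioned above goes in the head of each page you want to keep out of search results; a minimal example:

```
<head>
  <meta name="robots" content="noindex">
</head>
```

Unlike robots.txt, this works from any page on the site, with no access to the root directory required.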

A robots.txt file can be used on subdomains (such as https://site.example.com/robots.txt) or on non-standard ports (such as https://example.com:8181/robots.txt).
A robots.txt file must be a text file with UTF-8 encoding (which includes ASCII). Characters that aren't part of the UTF-8 range may be ignored by Google, rendering robots.txt rules invalid.
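One way to confirm the encoding before uploading is to run the file through iconv. This is a sketch: the printf line creates a sample file that stands in for your real robots.txt.

```shell
# Write a sample robots.txt, then verify it is valid UTF-8.
# A non-zero exit from iconv means the file contains bytes
# outside the UTF-8 range, which crawlers may ignore.
printf 'User-agent: *\nDisallow: /admin/\n' > robots.txt
iconv -f UTF-8 -t UTF-8 robots.txt > /dev/null && echo "robots.txt is valid UTF-8"
```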