Robots.txt Generator

Fill in the fields below with the required information to generate your robots.txt file.

Default - All Robots are:

Crawl-Delay:

Sitemap: Enter the full URL of your XML sitemap (optional).

Google

Google Image

Google Mobile

MSN Search

Yahoo

Yahoo MM

Yahoo Blogs

Ask/Teoma

GigaBlast

DMOZ Checker

Nutch

Alexa/Wayback

Baidu

Naver

MSN PicSearch

Restricted Directories:

The path is relative to the root and must contain a trailing slash "/".



Now create a 'robots.txt' file in your site's root directory, then copy the generated text above and paste it into that file.
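A generated file might look like the following sketch (the directory names, delay value, and sitemap URL are only illustrative; note that each restricted path is relative to the root and ends with a trailing slash):

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /private/
Crawl-delay: 10
Sitemap: https://www.example.com/sitemap.xml
```

An empty `Disallow:` line (or no `Disallow` at all) means the matching robots may crawl everything.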


About Robots.txt Generator

Robots.txt is a text file webmasters create to instruct web robots (typically search engine robots) how to crawl pages on their website.

The robots.txt file is part of the robots exclusion protocol (REP), a group of web standards that regulate how robots crawl the web, access and index content, and serve that content to users.

The REP also includes directives like meta robots, as well as page-, subdirectory-, or site-wide instructions for how search engines should treat links (such as "follow" or "nofollow").
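A meta robots directive, unlike robots.txt, lives in an individual page's HTML head. For example, to ask search engines not to index a page and not to follow its links:

```
<meta name="robots" content="noindex, nofollow">
```

robots.txt controls crawling, while meta robots controls indexing and link handling on pages a crawler has already fetched.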

WideOver lets you build a robots.txt file at no cost through a simple interface.

You can set a crawl-delay period and choose which bots are permitted or prohibited from crawling your site.
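For instance, a sketch of a file that permits Googlebot everywhere, blocks Baiduspider entirely, and sets a delay for all other bots (the bot names and delay value are illustrative):

```
User-agent: Googlebot
Disallow:

User-agent: Baiduspider
Disallow: /

User-agent: *
Crawl-delay: 5
```

Note that Crawl-delay is a nonstandard directive: some crawlers (such as Bing's) honor it, while Google ignores it and manages crawl rate through its own tools.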