Free Robots.txt Generator

Create a search-engine-friendly robots.txt file to control how search engines crawl your website.

Installation Instructions

  1. Copy the generated robots.txt content or download the file
  2. Upload the robots.txt file to your website's root directory
  3. Test at: yoursite.com/robots.txt
  4. Submit your sitemap in Google Search Console so crawlers discover your pages faster

Common Use Cases

  • Block admin and private directories (/admin/, /wp-admin/, /private/)
  • Prevent crawling of duplicate content (/print/, /mobile/)
  • Control crawl rate for large sites
  • Guide bots to your sitemap
  • Block specific file types (e.g., /*.pdf), as in the combined sketch below
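
A single file can combine several of these rules. A minimal sketch, assuming placeholder paths and an illustrative ten-second delay:

User-agent: *
Disallow: /admin/        # keep crawlers out of the admin area
Disallow: /print/        # skip duplicate print versions
Disallow: /*.pdf$        # block PDF files ($ anchors the match to the end of the URL)
Crawl-delay: 10          # seconds between requests (ignored by Googlebot)

Sitemap: https://example.com/sitemap.xml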

Understanding Robots.txt Files

A robots.txt file is a text file that tells search engine crawlers which pages or files they can or can't request from your site. It's placed in the root directory of your website and follows the Robots Exclusion Protocol.
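
The simplest valid file, for example, grants every crawler full access; an empty Disallow value blocks nothing:

User-agent: *
Disallow: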

Basic Robots.txt Syntax

  • User-agent: Specifies which crawler the rule applies to
  • Disallow: Tells crawlers not to access specific paths
  • Allow: Explicitly allows access to specific paths
  • Sitemap: Points crawlers to your XML sitemap
  • Crawl-delay: Sets a delay between requests (honored by some crawlers, ignored by Googlebot); see the sketch after this list
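
A sketch using all five directives together; the paths and the five-second delay are illustrative assumptions:

User-agent: *
Disallow: /private/              # block this directory
Allow: /private/overview.html    # but permit this one page inside it
Crawl-delay: 5                   # seconds between requests (ignored by Googlebot)

Sitemap: https://example.com/sitemap.xml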

Best Practices

  • Keep your robots.txt file simple and readable
  • Use specific paths rather than wildcards when possible
  • Include your sitemap URL for better crawling
  • Test your robots.txt file using Google Search Console
  • Don't rely on robots.txt to protect sensitive content; the file is publicly readable and only well-behaved crawlers obey it
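
To illustrate the wildcard advice above: the specific rule blocks exactly one directory, while the broad pattern can match far more than intended (both paths are hypothetical):

User-agent: *
Disallow: /staging/    # specific: blocks only this directory
# Disallow: /*stag*    # too broad: would also block /instagram-feed/ and similar URLs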

Common Examples

Block specific directories:

User-agent: *
Disallow: /admin/
Disallow: /private/
Disallow: /temp/

Allow all with sitemap:

User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
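
Throttle one crawler without affecting the rest (Crawl-delay is honored by some engines such as Bing, while Googlebot ignores it; the ten-second value is illustrative):

User-agent: Bingbot
Crawl-delay: 10

User-agent: *
Disallow: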
