Robots.txt Generator

Generate a custom robots.txt file to control how search engines crawl your website. Specify which pages to allow or disallow for different user agents.

The generator takes the following inputs:
  • Domain: your website's domain, used to build the sitemap URL
  • User agent rules: which paths to allow or disallow for each crawler
  • Crawl delay (optional): a delay between requests, honored mainly by non-Google bots
  • Quick templates: prefilled rule sets to start from
Robots.txt Guide
Common Directives (combined in the example below):
  • User-agent: Specifies which crawler the rules apply to
  • Disallow: Blocks access to the given path
  • Allow: Permits access to a path (overrides a broader Disallow)
  • Sitemap: Points to your XML sitemap
  • Crawl-delay: Sets the minimum delay between requests
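
Putting these together, a minimal robots.txt might look like this (example.com stands in for your domain):

    User-agent: *
    Disallow: /admin/
    Allow: /admin/public/
    # Googlebot ignores Crawl-delay; some other crawlers honor it.
    Crawl-delay: 10

    Sitemap: https://example.com/sitemap.xml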
Common Patterns (combined in the example below):
  • /admin/ - Block admin areas
  • /wp-admin/ - Block the WordPress admin
  • /*.pdf$ - Block PDF files ($ anchors the match to the end of the URL)
  • /search? - Block internal search result pages
  • /cart/ - Block shopping cart pages
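
As a sketch, these patterns combine into a single rule block like the following. Note that the * and $ wildcards are supported by major crawlers such as Googlebot and Bingbot but are not part of the original robots.txt specification:

    User-agent: *
    Disallow: /admin/
    Disallow: /wp-admin/
    Disallow: /*.pdf$
    Disallow: /search?
    Disallow: /cart/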
⚠️ Important: Robots.txt is publicly accessible. Don't use it to hide sensitive content - use proper authentication instead.
Validation

After uploading your robots.txt file, test it using:
  • A direct fetch: the file must be reachable at https://yourdomain.com/robots.txt and return HTTP 200
  • Google Search Console's robots.txt report, which shows how Googlebot parses your rules
  • A programmatic check, as in the sketch below
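
The following is a minimal sketch using Python's standard-library urllib.robotparser; the domain and paths are placeholders, not part of the generator. Note that RobotFileParser implements the original robots.txt specification, so it may not interpret * and $ wildcards the way Googlebot does.

    from urllib.robotparser import RobotFileParser

    # Fetch and parse the live robots.txt (placeholder domain).
    rp = RobotFileParser()
    rp.set_url("https://example.com/robots.txt")
    rp.read()

    # Check whether specific crawlers may fetch specific paths.
    checks = [
        ("Googlebot", "https://example.com/admin/"),
        ("*", "https://example.com/cart/checkout"),
    ]
    for agent, url in checks:
        verdict = "allowed" if rp.can_fetch(agent, url) else "blocked"
        print(f"{agent} -> {url}: {verdict}")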