🤖 Robots.txt Generator
Create custom robots.txt files to control how search engines crawl your website
📋 Quick Templates
Allow All
Allow all bots to crawl the entire site (sketched after this list)
Block All
Block all bots from crawling the site
Standard Site
Common rules that suit most websites
E-commerce
Typical rules for online stores
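For reference, the two simplest templates correspond to the canonical robots.txt patterns below (the generator's exact output may differ slightly):

# Allow All: an empty Disallow permits everything
User-agent: *
Disallow:

# Block All: a bare "/" blocks the entire site
User-agent: *
Disallow: /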
⚙️ Configure Rules
Rule 1
User-agent:
Directives:
Allow
Disallow
+ Add Directive
Crawl-delay (seconds):
+ Add Rule
Sitemap URLs (one per line):
Additional Comments:
🚀 Generate Robots.txt
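Clicking Generate with, say, one rule for Googlebot that disallows /private/, a crawl-delay of 10, and a sitemap URL (values illustrative) produces output along these lines:

User-agent: Googlebot
Disallow: /private/
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml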
📄 Generated Robots.txt
# Your robots.txt will appear here
# Fill the form and click Generate
📋 Copy
💾 Download
✅ Validate
📚 Robots.txt Guide
Common Directives
User-agent:
Specifies which bot the rules apply to
Disallow:
Blocks access to specific paths
Allow:
Explicitly allows access to paths, overriding a broader Disallow rule
Crawl-delay:
Sets a delay between successive requests (non-standard; Bing honors it, Google ignores it)
Sitemap:
Points to your XML sitemap
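A short annotated example tying these directives together (paths and the sitemap URL are illustrative):

User-agent: *                # the rules below apply to all crawlers
Disallow: /admin/            # block the /admin/ section...
Allow: /admin/public/        # ...except this subdirectory
Crawl-delay: 5               # honored by Bingbot; Googlebot ignores it

Sitemap: https://example.com/sitemap.xml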
Best Practices
Place robots.txt in your website root directory
Use specific paths instead of wildcards when possible (see the comparison after this list)
Test your robots.txt with Google Search Console
Include your sitemap URL
Keep it simple and readable
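On the wildcard point, a quick comparison (paths are illustrative):

# Wildcard: hard to predict exactly which URLs it catches
Disallow: /*?sessionid=

# Specific paths: the scope is obvious at a glance
Disallow: /cart/
Disallow: /checkout/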
Common User-Agents
* - All web crawlers
Googlebot - Google's web crawler
Bingbot - Microsoft Bing's crawler
Slurp - Yahoo's web crawler
facebookexternalhit - Facebook's link-preview crawler
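Rules can target these agents individually; a crawler obeys the most specific group that matches it, as in this sketch (path illustrative):

User-agent: *
Disallow: /private/

# Googlebot matches its own group, so it ignores the * group above
User-agent: Googlebot
Disallow: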