Robots.txt Generator
Create a robots.txt file to control how search engines crawl your website
Presets:
- Standard: allow all crawling
- E-commerce: optimized for online stores
- Blog: content-focused
- WordPress: optimized for WordPress sites
- Restrictive: block most bots
- Custom: start from scratch
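For reference, the Standard preset ("allow all crawling") corresponds to the simplest possible file; a minimal sketch:

    # An empty Disallow value permits all crawling
    User-agent: *
    Disallow: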
Configuration
Website Settings: your website's base URL, used for the sitemap reference
Sitemap
Include Sitemap: add the sitemap URL to the generated robots.txt
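When enabled, the generator appends the sitemap as an absolute URL; assuming example.com as the site, the line would read:

    Sitemap: https://example.com/sitemap.xml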
Common Blocks
Each toggle adds Disallow rules for a group of commonly excluded paths (see the sketch after this list):
- Block Admin Areas: /admin/, /wp-admin/, /dashboard/
- Block Login Pages: /login/, /signin/, /register/
- Block Cart/Checkout: /cart/, /checkout/, /basket/
- Block Search Results: /search/, ?s=, ?q=
- Block API Endpoints: /api/, /rest/, /graphql/
- Block Private/Thank You: /private/, /thank-you/, /confirmation/
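As an illustration, enabling Block Admin Areas and Block Search Results might produce the following rules; the query-string patterns rely on the * wildcard, which major crawlers such as Googlebot and Bingbot support:

    User-agent: *
    Disallow: /admin/
    Disallow: /wp-admin/
    Disallow: /dashboard/
    Disallow: /*?s=
    Disallow: /*?q=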
User Agent Rules
All Bots (*): apply the rules to every crawler via User-agent: *
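Rules are grouped per user agent, and a crawler follows the most specific group that matches its name, ignoring the * group. A sketch that applies defaults to all bots but blocks one crawler entirely (using OpenAI's GPTBot as the example):

    User-agent: *
    Disallow: /admin/

    User-agent: GPTBot
    Disallow: /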
Crawl Delay
Enable Crawl Delay: ask bots to wait between requests
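When enabled, a Crawl-delay line is added to the rule group. Note that Crawl-delay is non-standard: Googlebot ignores it, while some other crawlers treat it as a minimum number of seconds between requests. A sketch with a 10-second delay:

    User-agent: *
    Crawl-delay: 10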
Additional Disallowed Paths: one path per line
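Each line you enter becomes its own Disallow rule. Paths may also use the * and $ pattern characters that major crawlers support; for example, entering the hypothetical paths /tmp/ and /*.pdf$ would yield:

    Disallow: /tmp/
    Disallow: /*.pdf$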
Generated robots.txt
# robots.txt
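The output opens with the comment header above. As an illustration, a configuration with Block Admin Areas, Block Login Pages, and Include Sitemap enabled might generate something like this (example.com is a placeholder; the actual file depends on your settings):

    # robots.txt
    # Generated for https://example.com

    User-agent: *
    Disallow: /admin/
    Disallow: /wp-admin/
    Disallow: /dashboard/
    Disallow: /login/
    Disallow: /signin/
    Disallow: /register/

    Sitemap: https://example.com/sitemap.xml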
Best Practices
- Place robots.txt at your site's root directory (e.g., example.com/robots.txt); crawlers only look for it there
- robots.txt is publicly accessible, so never list sensitive paths in it; Disallow rules can reveal exactly what you want hidden
- Use Disallow: / to block all crawling on staging sites (see the sketch after this list)
- Include a sitemap reference to help crawlers discover your pages
- Test your robots.txt with the robots.txt report in Google Search Console
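A minimal staging-site file that blocks all compliant crawlers, as referenced above; note that blocking crawling does not remove URLs that are already indexed, so HTTP authentication or a noindex header is a stronger safeguard for staging environments:

    User-agent: *
    Disallow: /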