Configure Rules
Set up your robots.txt directives
User Agent #1
Select a specific user agent, or use * to match all crawlers.
Crawl-delay
Delay between successive requests, in seconds.
Rules
Allow and Disallow path directives for each user agent.
Sitemaps
Sitemap URLs to append to the generated file.
Generated robots.txt
User-agent: *
Disallow:
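As a fuller sketch, a generated file that combines a crawl delay, per-path rules, and a sitemap reference might look like the following (the domain and paths are placeholders, not output of this tool):

```
User-agent: *
Crawl-delay: 10
Disallow: /admin/
Allow: /admin/public/

Sitemap: https://example.com/sitemap.xml
```

Note that Crawl-delay is a de facto extension honored by some crawlers but ignored by others, including Googlebot.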
💡 Tips & Best Practices
- Use * to apply rules to all crawlers
- An empty Disallow: allows access to the entire site
- Use Disallow: / to block the entire site
- Place robots.txt in your website's root directory
- Test your robots.txt with Google Search Console
- Include your XML sitemap for better crawling
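Beyond Google Search Console, you can sanity-check rules locally before deploying. A minimal sketch using Python's standard-library urllib.robotparser (the directives and URLs below are illustrative):

```python
from urllib.robotparser import RobotFileParser

# Parse a robots.txt given as a list of lines (no network fetch needed).
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
    "Crawl-delay: 10",
])

# Check whether a given user agent may fetch specific URLs.
print(rp.can_fetch("*", "https://example.com/public/page.html"))    # allowed
print(rp.can_fetch("*", "https://example.com/private/secret.html")) # blocked
print(rp.crawl_delay("*"))                                          # parsed delay
```

To check a live site instead, call rp.set_url("https://example.com/robots.txt") followed by rp.read() before querying can_fetch.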
