Robots.txt Generator
Create robots.txt files for your website
About robots.txt
The robots.txt file tells search engine crawlers which URLs they may access on your site. It must be placed at the root of your website (e.g., example.com/robots.txt); crawlers will not look for it anywhere else. Note that robots.txt is advisory: well-behaved bots follow it, but it is not an access-control mechanism.
User-agent: * - Apply the rules that follow to all bots
Disallow: /path/ - Block access to a path
Allow: /path/ - Explicitly allow access (useful to carve out an exception after a broader Disallow)
Sitemap: - Tell bots the absolute URL of your XML sitemap
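Putting these directives together, a minimal robots.txt might look like this (the paths and sitemap URL are placeholders for illustration):

```
User-agent: *
Allow: /private/help/
Disallow: /private/
Sitemap: https://example.com/sitemap.xml
```

The Allow line carves an exception out of the broader Disallow, so /private/help/ remains crawlable while the rest of /private/ is blocked.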
Common Patterns
Disallow: / - Block all pages
Disallow: (empty) - Allow all pages
Disallow: /*.pdf$ - Block all PDF files (the * and $ wildcards are extensions honored by major crawlers such as Googlebot, not part of the original standard)
Disallow: /private/ - Block specific folder
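You can check how a given set of rules applies to a URL using Python's standard-library robots.txt parser. This is a minimal sketch with hypothetical rules and URLs; note that the stdlib parser matches rules in the order they appear, so the narrower Allow line is listed before the broader Disallow:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules illustrating the patterns above
rules = """\
User-agent: *
Allow: /private/help/
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Pages outside /private/ are crawlable
print(parser.can_fetch("*", "https://example.com/page.html"))          # True
# /private/ is blocked, except the /private/help/ carve-out
print(parser.can_fetch("*", "https://example.com/private/secret.html"))  # False
print(parser.can_fetch("*", "https://example.com/private/help/faq.html"))  # True
```

The stdlib parser does not implement the * and $ wildcard extensions, so test wildcard patterns against the crawler documentation of the bots you care about.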