Free SEO Tool
Robots.txt Tester & Validator
Test and validate your robots.txt configuration. See exactly how different search engine crawlers interpret your directives.
Fetch from Domain
or paste below ↓
Edit or Paste Robots.txt Content
About Robots.txt
The robots.txt file tells search engine crawlers which URLs they may crawl on your site. It's primarily used to manage crawler traffic and prevent your server from being overloaded with requests; it is not a reliable way to keep a page out of search results, since a disallowed URL can still be indexed if other pages link to it.
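For illustration, a minimal robots.txt file (hypothetical example) served from the site root, e.g. https://example.com/robots.txt, might look like this:

```txt
# Rules for all crawlers
User-agent: *
Disallow: /admin/

# Optional: point crawlers at the sitemap
Sitemap: https://example.com/sitemap.xml
```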
Common Directives:
- User-agent: Specifies which crawler the rules below apply to (`*` matches all)
- Disallow: Blocks crawling of the specified path prefix
- Allow: Overrides a broader Disallow rule for a sub-path
- Crawl-delay: Minimum seconds between requests (non-standard; some crawlers, including Googlebot, ignore it)
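As a sketch of how these directives are evaluated, Python's standard-library `urllib.robotparser` can test whether a given user agent may fetch a path. Note that this parser applies rules in file order (first match wins), whereas some crawlers such as Googlebot use longest-path precedence, so the Allow rule is listed first here:

```python
from urllib import robotparser

# Hypothetical robots.txt content: /private/ is blocked,
# but /private/docs/ is re-allowed via an Allow override.
rules = """\
User-agent: *
Allow: /private/docs/
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Disallow: /private/ blocks this path
print(rp.can_fetch("*", "/private/secret.html"))     # False

# Allow: /private/docs/ overrides the broader Disallow
print(rp.can_fetch("*", "/private/docs/page.html"))  # True
```

Because real crawlers differ in how they resolve Allow/Disallow conflicts, a tester like this page is useful for checking how a specific engine interprets your file.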