

Welcome to OpenRobotsTXT
OpenRobotsTXT is an open archive of the world’s robots.txt files. By visiting domains and caching these files over time, we track how they change and which user agents access them. Our goal is to provide valuable insights, tools, and reports for webmasters, researchers, and the wider internet community, supporting open, public study.
Here is what you can do:
- View Stats for the user-agents (bots) we’ve detected in our analysis.
- Look up data on individual bots via the bots search, and validate your user-agent declarations against web consensus.
- View your robots.txt as bots see it, which is useful for verifying that your server setup delivers the robots.txt file you intend.
If you want to learn more about the OpenRobotsTXT project, visit the about page.
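Checking which paths a robots.txt file allows for a given user-agent can be done locally with Python's standard `urllib.robotparser`. This is a minimal sketch, not OpenRobotsTXT's own tooling; the sample robots.txt content and URLs are hypothetical.

```python
# Sketch: test how a robots.txt file treats different user-agents,
# using only the Python standard library. The sample rules below are
# illustrative, not taken from any real site.
from urllib.robotparser import RobotFileParser

SAMPLE_ROBOTS_TXT = """\
User-agent: Googlebot
Disallow: /private/

User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(SAMPLE_ROBOTS_TXT.splitlines())

# Googlebot matches its own group: only /private/ is off-limits.
print(parser.can_fetch("Googlebot", "https://example.com/page"))       # True
print(parser.can_fetch("Googlebot", "https://example.com/private/x"))  # False

# Any other bot falls through to the wildcard group and is blocked.
print(parser.can_fetch("SomeOtherBot", "https://example.com/page"))    # False
```

Note that this checks the rules as written; to see the file exactly as a bot receives it (including redirects or user-agent-dependent server behavior), you would fetch it over HTTP with that bot's User-Agent header.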
Stats
- Hostnames scanned: 659.2m (so far)
- User-agents: 54,533 (found in files)
- Last updated: 12 mins ago (17 Oct 2025, 19:12 UTC)
AI Content-Usage Adoption
OpenRobotsTXT now reports on the proposal by Illyes (Google) and Thomson (Mozilla) to add AI controls to robots.txt.
- Root domains: 743
- Hostnames: 1,859
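For illustration, a robots.txt file adopting the proposed AI controls might look roughly like the sketch below. The `Content-Usage` directive name and the `train-ai=n` preference token reflect our reading of the draft proposal; the exact syntax and vocabulary are still subject to change, and the paths shown are hypothetical.

```
# Illustrative sketch only; directive syntax may change as the draft evolves.
User-Agent: *
Content-Usage: train-ai=n
Allow: /
```

Files like this are what the counts above measure: domains whose robots.txt already includes a `Content-Usage` line.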
Top Explicitly Allow-All Bots
| # | User-Agent | Mentions | Allow All |
|---|---------------------|----------|-----------|
| 1 | Googlebot | 20.4m | 1.5m |
| 2 | facebookexternalhit | 1.4m | 0.6m |
| 3 | bingbot | 17.4m | 0.5m |