

Welcome to OpenRobotsTXT
OpenRobotsTXT is an open archive of the world's robots.txt files. By visiting domains and caching these files over time, we track how they change and which user agents access them. Our goal is to provide valuable insights, tools, and reports for webmasters, researchers, and the wider internet community, supporting open, public study.
Here is what you can do:
- View Stats for the user-agents (bots) we’ve detected in our analysis.
- Look up data on individual bots via the bots search. Validate your user-agent declarations against web consensus.
- Look at your robots.txt as bots do. Great for making sure your server setup delivers the robots.txt file as you intend.
If you want to learn more about the OpenRobotsTXT project, visit the about page.
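Checking how a robots.txt file treats a particular bot can be done locally with Python's standard-library parser. The sketch below is illustrative only: the sample rules and the `MyCrawler` name are assumptions, not data from OpenRobotsTXT.

```python
# Sketch: test which user-agents a robots.txt file allows, using the
# standard-library parser. Rules and bot names here are made up.
import urllib.robotparser

SAMPLE_ROBOTS_TXT = """\
User-agent: Googlebot
Allow: /

User-agent: *
Disallow: /private/
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(SAMPLE_ROBOTS_TXT.splitlines())

# Googlebot matches its own group, which allows everything.
print(parser.can_fetch("Googlebot", "https://example.com/private/page"))  # True
# Any other bot falls through to the wildcard group.
print(parser.can_fetch("MyCrawler", "https://example.com/private/page"))  # False
print(parser.can_fetch("MyCrawler", "https://example.com/public/page"))   # True
```

`RobotFileParser` can also fetch a live file via `set_url()` and `read()`, which is handy for confirming that your server actually serves the rules you think it does.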
Stats
Hostnames Scanned
654.5m
So far
User-Agents
53,897
Found in files
Last Updated
5 mins ago
01 Sept 2025, 17:45 (UTC)
AI Content-Usage Adoption
OpenRobotsTXT now reports on the proposal by Illyes (Google) and Thomson (Mozilla) to add AI usage controls to robots.txt.
507
Root Domains
1,352
Hostnames
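For context, the proposal extends robots.txt with a content-usage directive that expresses preferences about AI use of a site's content. The snippet below is an illustrative sketch only: the directive name and the `train-ai=n` value reflect my reading of the draft, and the exact syntax may change as the proposal evolves.

```
# Illustrative sketch of the proposed AI-control syntax (subject to change):
# allow all crawling, but opt out of AI training use.
User-Agent: *
Content-Usage: train-ai=n
Allow: /
```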
Top Explicitly Allow-All Bots
| # | User-Agent | Mentions | Allow All |
| --- | --- | --- | --- |
| 1 | Googlebot | 21.3m | 1.4m |
| 2 | facebookexternalhit | 1.3m | 0.5m |
| 3 | bingbot | 15.7m | 0.5m |
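To make the table concrete: an "explicitly allow-all" robots.txt is one that names the bot in its own group and permits everything. The stanza below is a minimal sketch of what such an entry typically looks like (the interpretation of "explicit allow-all" is assumed, not defined by the table itself).

```
# Explicit allow-all for a named bot: the bot gets its own group
# and no path is disallowed.
User-agent: Googlebot
Disallow:
```

An empty `Disallow:` line (or `Allow: /`) in a group naming the bot grants it access to the whole site, as opposed to the bot merely being covered by a wildcard `User-agent: *` group.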