
AI robots.txt

This is an open list of web crawlers, associated with AI companies and the training of LLMs, that you may wish to block. We encourage you to contribute to the list and to implement it on your own site.

A number of these crawlers have been sourced from Dark Visitors, and we appreciate the ongoing effort they put into tracking them.

If you'd like to add a crawler to the list, please open a pull request that adds the bot name to robots.txt and ai.txt, along with any relevant details in table-of-bot-metrics.md to help people understand what's crawling.
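As a minimal sketch of the pattern the list follows, blocking a single AI crawler in robots.txt looks like the lines below. GPTBot is used here purely for illustration; the maintained robots.txt in this repository contains the full set of user agents.

# Block one AI crawler (illustrative entry only)
User-agent: GPTBot
Disallow: /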


Additional resources

Spawning.ai
Create an ai.txt: an additional avenue to block crawlers. Example file:

# Spawning AI
# Prevent datasets from using the following file types

User-Agent: *
Disallow: /
Disallow: *
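Like robots.txt, an ai.txt file is typically served from the root of your site (e.g. /ai.txt); check Spawning's documentation for the current specification.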

Have I Been Trained?
Search datasets for your content and request its removal.


Thank you to Glyn for pushing me to set this up after I posted about blocking these crawlers.
