Reading a Fedora infrastructure update about yet more problems with AI scrapers. The pain these bots inflict on the rest of the Internet is really disgusting. I hope users of these tools are conscious of how the vendors acquire their data, and take pains to avoid the vendors or projects that ignore robots.txt (and the like) and go out of their way to avoid being blocked.
These parasites are *worse* than spammers; spam is typically a nuisance, not something that causes this much damage. What these companies are doing is flat-out unethical, and supporting them is equally unethical.