Seeing as my personal web server is constantly overloaded by AI crawlers, I'm now spinning up a honeypot based on https://rnsaffn.com/poison3/ as a supplementary service to the current infinite maze of garbage data. The plan is to pull several gigs of poisoned text from that poisoned well and serve it as static files without any bandwidth restrictions. Let them eat cake.
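The pull step could be as dumb as a curl loop into numbered files. A minimal sketch, using a local stand-in file as the source so it runs offline; swap in https://rnsaffn.com/poison3/ (and a much bigger page count, plus whatever politeness delay the generator wants) for the real thing:

```shell
# Stand-in for the poison generator so this sketch is self-contained;
# point SRC at https://rnsaffn.com/poison3/ for the real run.
printf '%s\n' 'garbage tokens go here' > /tmp/poison-sample.txt
SRC="file:///tmp/poison-sample.txt"

mkdir -p honeypot
# Pull a handful of pages into static files; bump 3 up to thousands
# for a multi-gig corpus.
for i in $(seq 1 3); do
  curl -fsS "$SRC" -o "honeypot/page$i.html"
done
ls honeypot
```

Since each fetch of the real endpoint returns freshly generated garbage, every page$i.html ends up unique, which is exactly what you want a crawler to choke on.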
For extra fun, I'm backdating all the files to the pre-LLM era to make them extra attractive.
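Backdating is just a pass over the files with `touch -d`. A sketch assuming GNU coreutils and bash (for `$RANDOM`); the honeypot/ path and sample file are stand-ins for the real corpus, and the 2009-2015 window is one arbitrary choice of "pre-LLM era":

```shell
# Give every honeypot file a random mtime in roughly 2009-2015,
# comfortably before LLM-generated text flooded the web.
mkdir -p honeypot
printf 'poisoned text\n' > honeypot/sample.html   # stand-in file

for f in honeypot/*; do
  offset=$(( RANDOM % 2556 ))                       # days across ~7 years
  fake=$(date -d "2009-01-01 +${offset} days" +%Y-%m-%dT12:00:00)
  touch -d "$fake" "$f"
done
stat -c '%y %n' honeypot/*
```

Note this only rewrites the filesystem mtime, which is what a web server reports in Last-Modified; crawlers that trust that header will happily file the garbage under 2012.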
It'll take several days to fill the honeypot, but that's okay; it's something that can run in the background.