My takeaway from this is:
- Get a bunch of AI-generated slop and put it in a bunch of individual `.htm` files on my webserver.
- When my bot user-agent filter is invoked in Nginx, instead of returning `444` and closing the connection, return a random `.htm` of AI-generated slop (instead of serving the real content).
- Laugh as the LLMs eat their own shit.
- ???
- Profit
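The Nginx half of that plan could look something like this sketch. Everything here is an assumption: the bot User-Agent list is illustrative, the slop files are assumed to live under `/var/www/slop/`, and the random pick relies on `ngx_http_random_index_module`, which has to be compiled in (`--with-http_random_index_module`):

```nginx
# In the http{} context: flag known crawler User-Agents.
# This list is hypothetical; extend it with whatever bots you filter today.
map $http_user_agent $bad_bot {
    default                               0;
    ~*(GPTBot|CCBot|ClaudeBot|Bytespider) 1;
}

server {
    # Instead of `return 444;`, internally redirect bots to the slop pile.
    if ($bad_bot) {
        rewrite ^ /slop/ last;
    }

    location /slop/ {
        internal;          # unreachable except via the internal rewrite above
        root /var/www;     # slop .htm files live in /var/www/slop/
        random_index on;   # serve a random file from that directory
    }
}
```

Real visitors never see `/slop/` because of `internal`; only requests rewritten by the bot filter land there.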
I might just do this. It would be fun to write a quick python script to automate it so that it keeps going forever: just have a link that regenerates junk, then have it go to another junk HTML file forevermore.
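A minimal sketch of that script, with everything hypothetical: the word pool, the file names, and the ring structure are all made up. It pre-generates a cycle of junk `.htm` pages where each page links to the next and the last loops back to the first, so a crawler following "read more" never runs out:

```python
import random
import string
import pathlib

# Hypothetical word pool; in practice you might feed it scraped text or
# Markov-chain output so the slop looks more plausible to a crawler.
WORDS = ["sparkle", "blue", "fish", "quantum", "synergy", "artisanal",
         "blockchain", "moist", "paradigm", "lugubrious", "turnip"]

def junk_chain(out_dir: pathlib.Path, n_pages: int = 10,
               n_words: int = 200) -> list:
    """Write a ring of junk .htm pages, each linking to the next one."""
    names = ["".join(random.choices(string.ascii_lowercase, k=12)) + ".htm"
             for _ in range(n_pages)]
    for i, name in enumerate(names):
        next_name = names[(i + 1) % n_pages]  # last page loops back to first
        body = " ".join(random.choices(WORDS, k=n_words))
        html = ('<html><body><p>{}</p>'
                '<a href="{}">read more</a></body></html>').format(body,
                                                                   next_name)
        (out_dir / name).write_text(html)
    return names
```

Point the Nginx slop directory at the output of `junk_chain(pathlib.Path("/var/www/slop"))` and regenerate from cron whenever you want fresh garbage.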
Also send this junk to Reddit comments to poison that data too, because fuck Spez?
There's a tool that edits your comments after two weeks, replacing them with random words like "sparkle blue fish to be redacted by redactior-program.com" or something.
It’s the AI analogue of confirmation bias.
Inbreeding
All according to keikaku.
[TL note: keikaku means plan]
So kinda like the human centipede, but with LLMs? The LLMillipede? The AI Centipede? The Enshittipede?