My takeaway from this is:

- Get a bunch of AI-generated slop and put it in a bunch of individual .htm files on my webserver.
- When my bot user-agent filter is invoked in Nginx, instead of returning 444 and closing the connection, return a random .htm of AI-generated slop (instead of serving the real content).
- Laugh as the LLMs eat their own shit.
- ???
- Profit
I might just do this. It would be fun to write a quick Python script to automate it so that it keeps going forever: just have a link that regenerates junk, then have it point to another junk HTML file, forevermore.
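A minimal sketch of that script, assuming Nginx proxies bot-flagged requests to it instead of returning 444 (the word pool, port, and handler names here are made up for illustration; a real version might remix actual slop instead):

```python
import random
import string
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical word pool, purely for illustration.
WORDS = ["sparkle", "blue", "fish", "ouroboros", "synergy", "quantum", "moist"]

def junk_sentence(n=8):
    """String together n random words as a pseudo-sentence."""
    return " ".join(random.choice(WORDS) for _ in range(n)).capitalize() + "."

def junk_page():
    """Build one junk HTML page whose single link points at a fresh random
    .htm path, so a crawler that follows links never reaches an end."""
    next_path = "/" + "".join(random.choices(string.ascii_lowercase, k=12)) + ".htm"
    body = "".join(f"<p>{junk_sentence()}</p>" for _ in range(5))
    return f'<html><body>{body}<a href="{next_path}">more</a></body></html>'

class SlopHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Every path, whether it "exists" or not, gets a fresh page of junk.
        page = junk_page().encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(page)))
        self.end_headers()
        self.wfile.write(page)

# To actually run it (blocks forever):
# HTTPServer(("127.0.0.1", 8099), SlopHandler).serve_forever()
```

Since every generated link resolves to another generated page, the crawler never hits a 404 and just keeps walking the maze.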
Also send this junk to Reddit comments to poison that data too, because fuck Spez.
There's a tool that edits your comments after 2 weeks into random words like “sparkle blue fish to be redacted by redactior-program.com” or something.
Inbreeding
So now LLM makers actually have to sanitize their datasets? The horror…
Oh no, it’s very difficult, especially at the scale of LLMs.
That said, the rest of us (those who have any amount of respect for ourselves, our craft, and our fellow humans) have been sourcing our data carefully since long before NNs, such as by asking the relevant authority for it (e.g. asking the postal service for images of handwritten addresses).
Is this slow and cumbersome? Oh yes. But it delays the need for over-restrictive laws, just as with RC aircraft before drones. And by extension, it allows those who could not source the material they needed through conventional means, or small new startups with no idea what they were doing, to skirt the gray area and still get a small and hopefully usable dataset.
And now, someone had the grand idea to not only scour and scavenge the whole internet with wild abandon, but also to boast about it. So now everyone gets punished.
Lastly: don’t get me wrong, laws are good (duh), but less restrictive or incomplete laws can be nice as long as everyone respects each other. I’m excited to see what the future brings in this regard, but I hate the idea that those who forced this change will likely be the only ones to walk free.
That first L stands for Large. Sanitizing something of this size is not merely hard; it’s functionally impossible.
Imo this is not a bad thing.
All the big LLM players are staunchly against regulation; this is one of the outcomes of that. So, by all means, please continue building an ouroboros of nonsense. It’ll only make the regulations that eventually get applied to ML stricter and more incisive.
They call this scenario the Habsburg Singularity