What’s the academic terminology for “go pound sand”?
So what you’re saying is, don’t beat the targets because fuck those guys. Understood.
Feed the LLM with LLM-generated books. No resentment at all!
How many of these books will be total garbage, churned out just to fulfill a prearranged quota?
Now the LLMs are filled with a good amount of nonsense.
Just use the LLM to make the books that the LLM then trains on. What could go wrong?
Someone’s probably already coined the term, but I’m going to call it LLM inbreeding.
In computer science, garbage in, garbage out (GIGO) is the concept that flawed, biased or poor quality (“garbage”) information or input produces a result or output of similar (“garbage”) quality. The adage points to the need to improve data quality in, for example, programming.
There was a research article applying this ’70s computer-science concept to LLMs. It was published in Nature and hit major news outlets. Basically, they kept training GPT on its own output for a few generations until the model degraded terribly. Sounded obvious to me, but seeing it happen on the open web is painful nonetheless…
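For anyone curious, the coined term is “model collapse” (Shumailov et al., Nature, 2024). Here’s a toy sketch of the loop in Python; this is my own illustration, not the paper’s actual setup. Each “generation” fits a single Gaussian to the previous generation’s samples, then samples its next training corpus from that fit. The bimodal structure of the original data is gone after one generation, and the fitted sigma drifts on pure sampling noise from there:

```python
# Toy model-collapse loop (my own sketch, not the Nature paper's method).
import numpy as np

rng = np.random.default_rng(0)

def train_generations(data, n_generations=20, sample_size=100):
    for gen in range(n_generations):
        mu, sigma = data.mean(), data.std()        # "train": fit one Gaussian
        data = rng.normal(mu, sigma, sample_size)  # "scrape": sample next corpus
        print(f"gen {gen:2d}: mu={mu:+.3f}  sigma={sigma:.3f}")
    return data

# "Real" data: a bimodal mix that a single Gaussian cannot represent.
real = np.concatenate([rng.normal(-2, 0.5, 500), rng.normal(2, 0.5, 500)])
train_generations(real)
```

The detail that never makes it into headlines: the tails go first. Rare structure in the original distribution stops being sampled, so the next generation never sees it at all.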
Soylent Green is a lie anyway. You’d need to “soylentify” half the population every year to feed the other half if it were the only source of calories.
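Quick sanity check with ballpark figures I’m assuming (roughly 125,000 kcal of energy in an average human body, per published estimates, and ~2,000 kcal/day per person), nothing here is from the film:

```python
# Back-of-envelope check; both constants are rough assumptions.
KCAL_PER_BODY = 125_000                  # assumed caloric content of one body
KCAL_PER_PERSON_PER_YEAR = 2_000 * 365   # assumed daily need x 365 days

bodies_needed = KCAL_PER_PERSON_PER_YEAR / KCAL_PER_BODY
print(f"~{bodies_needed:.1f} bodies per survivor per year")  # ~5.8
```

So on these numbers it’s closer to six people consumed per survivor per year, which makes the scheme even less workable than the “half” framing suggests.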