It’s actually not clear that viewing such material leads a person to commit in-person abuse.
Providing non-harmful ways to access the content may even lead to less abuse, since the content sought would no longer come from abuse, reducing demand for abusive material.
That being said, this instance isn’t completely fabricated, and its further release is harmful because it involves a real person and will have an emotional impact on them.
There are other instances where the material was completely fabricated, and the courts still ruled it was CSAM and convicted.
There have been, yes, but that doesn’t mean the ruling was right in law. The law also varies by jurisdiction because it is a murky area.
Edit: in the USA it might not even be illegal unless there was intent to distribute.
By the statute’s own terms, the law does not make all fictional child pornography illegal, only that found to be obscene or lacking in serious value. The mere possession of said images is not a violation of the law unless it can be proven that they were transmitted through a common carrier, such as the mail or the Internet, transported across state lines, or of an amount that showed intent to distribute.
So locally generating fictional material with AI, without distributing it, may be okay federally in the USA.