Mhm, I have mixed feelings about this. I know that this entire thing is fucked up, but isn't it better to have generated stuff than actual stuff that involved actual children?

The arrest is only a positive. Allowing pedophiles to create AI CP is not a victimless crime. As others point out it muddies the water for CP of real children, but it also potentially would allow pedophiles easier ways to network in the open (if the images are legal they can easily be platformed and advertised), and networking between abusers absolutely emboldens them and results in more abuse.

As a society we should never allow the normalization of sexualizing children.

Actually, that’s not quite as clear.

The conventional wisdom used to be that (normal) porn makes people more likely to commit sexual abuse (in general). Then scientists decided to look into that. Slowly, over time, they've become more and more convinced that (normal) porn availability in fact reduces sexual assault.

I don’t see an obvious reason why it should be different in case of CP, now that it can be generated.

It should be different because people cannot have it. It is disgusting, it makes them feel icky, and that's just why it has to be bad. Conventional wisdom sometimes really is just conventional idiocy.

You know what's better? Having none of this shit.

Yeah, as I also said.

A problem that I see getting brought up is that AI-generated images make it harder to identify photos of actual victims, which in turn makes it harder to locate and save them.

Depends on what kind of images. 99% of CP is just kids naked on the beach or teens posting pictures of themselves. And especially with the latter, there's no one to save, nor does it really harm anyone, nor should it be as illegal as the actual 1% rape footage. And even said rape footage is just the same stuff again and again, often decades old. And based on that, I don't think any AI could produce "usable" material.

And of course, the group violating the law in this regard the most are the kids/teens themselves, sending nudes or uploading them to forums.

Did we memory-hole the whole 'known CSAM in training data' thing that happened a while back? When you're vacuuming up the internet, you're going to wind up with the nasty stuff too. Even if it's not a pixel-by-pixel match of the photo it was trained on, there's a non-zero chance that what it's generating is based off actual CSAM. Which is really just laundering CSAM.

permalink
report
parent
reply
0 points

I didn’t know that, my bad.

Posted in: Technology (!technology@lemmy.world)