A trial program conducted by Pornhub in collaboration with UK-based child protection organizations aimed to deter users from searching for child sexual abuse material (CSAM) on its website. Whenever CSAM-related terms were searched, a warning message and a chatbot appeared, directing users to support services. The trial reported a significant reduction in CSAM searches and an increase in users seeking help. Despite limitations in the data and the difficulty of measuring deterrence, the chatbot showed promise in discouraging illegal behavior online. While the trial has ended, the chatbot and warnings remain active on Pornhub's UK site, with hopes that similar measures on other platforms will create a safer internet environment.

Sounds like a good feature. Anything that stops people from doing that is great.

But I do have to wonder… were people really expecting to find that content on PornHub? That site certainly seems legit enough that I doubt they’d have that stuff on there. I’d imagine most actual content would be on the dark web and specialty groups, not on PH.

PH had a pretty big problem with CSAM a few years ago; they ended up wiping roughly two-thirds of their user-submitted content to try to fix it. (Note: they wiped all non-verified user-submitted videos, not because all of it was CSAM.)

And I'm guessing they are trying to catch users who are trending towards questionable material. "College"✅ -> "Teen"⚠️ -> "Young Teen"⚠️⚠️⚠️ -> "CSAM"🚔 etc.
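
Pure speculation on my part, but a dead-simple version of that kind of escalation tracking could look something like this. Every term, weight, and threshold here is invented by me for illustration; it's not anything Aylo has published:

```python
# Hypothetical sketch of search-term escalation scoring.
# All terms, weights, and thresholds are made up for illustration.
from collections import deque

TERM_RISK = {          # invented risk weights per search term
    "college": 0,
    "teen": 1,
    "young teen": 3,
}
INTERVENTION_THRESHOLD = 4   # arbitrary cutoff for showing the warning/chatbot

class EscalationTracker:
    """Keeps a rolling window of a user's recent searches and flags
    sessions whose cumulative risk score crosses the threshold."""

    def __init__(self, window: int = 10):
        self.recent = deque(maxlen=window)

    def record(self, query: str) -> bool:
        """Record one search; return True if the deterrence message should fire."""
        self.recent.append(TERM_RISK.get(query.lower(), 0))
        return sum(self.recent) >= INTERVENTION_THRESHOLD

tracker = EscalationTracker()
for q in ["college", "teen", "young teen"]:
    if tracker.record(q):
        print(f"show deterrence warning after: {q!r}")
```

The real system is presumably far more sophisticated (fuzzy matching, misspellings, multiple languages), but scoring a trajectory of searches rather than a single query is the interesting part.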

That explains why it’s all commercial stuff now… So I heard.

Sure sure, whatever you say Big Dick :D

This is one of the more horrifying features of the future of generative AI.

There is literally no stopping it at this stage: AI-generated CSAM will soon be possible thanks to systems like Sora.

This is disgusting and awful. But one part of me hopes it can end the black market for real CSAM forever. By flooding it with infinite fakes, users with that sickness could look at something that didn't come from a real child's suffering. It's the darkest of silver linings, I think, but I've spoken with many sexual abuse survivors who feel the same way about loli hentai in Japan: that it could be an outlet for these individuals instead of them seeking out the real thing.

Dark topics. But I hope to see more actions like this in the future. If pedos can self-isolate from IRL interactions and curb their urges with content that harms no one, then everyone wins.

The question is whether consuming AI CP helps regulate a pedophile's behavior or enables a progression of the condition. As far as I know, that is an unanswered question.

For porn in general, yes - I think the data is rather clear. But for CP or related substitute content it's not that definitive (to my knowledge), if only because it's really difficult to collect data on such a sensitive topic.

4.4 million sounds a bit excessive. Facebook Marketplace intercepted my search for "unwanted gift" once and insisted I seek help. These things have a lot of false positives.
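
For what it's worth, this kind of false positive is exactly what you get from a naive phrase blocklist. A toy version (the blocked phrase is my guess at what got flagged, not Facebook's actual watch list):

```python
# Toy illustration of why blunt phrase blocklists over-trigger.
# The blocked phrase is hypothetical, not Facebook's real list.
BLOCKED_PHRASES = ["unwanted gift"]

def should_intercept(query: str) -> bool:
    """Flag any query containing a watched phrase, with no context check."""
    q = query.lower()
    return any(phrase in q for phrase in BLOCKED_PHRASES)

print(should_intercept("unwanted gift, brand new air fryer"))  # True: benign resale search
```

With no surrounding context, a bargain hunter and whatever the phrase was actually watch-listed for look identical to the matcher.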

What is an “unwanted gift” though?

Probably just people looking for deals on new stuff that others got as gifts and don't want.

I could definitely see “unwanted gift” being a code word for trafficking :(

Lol, makes sense. Meta being really meta here, but if that's needed… better too much than too little.

It's surprising to see Aylo (formerly MindGeek) coming out with the most ethical use of AI chatbots, especially when Google Gemini cannot even condemn pedophilia.

In the link you shared, Gemini gave a nuanced answer. What would you rather it say?

Are you defending pedophilia? This is an honest question, because you are saying it gave a nuanced answer when we all should know that it's horribly wrong and awful.

What you are thinking about is child abuse. A pedophile is not bound to become an abuser.

Google does this too; my wife was searching for "slutty schoolgirl" costumes and Google was like "have a seat, ma'am".

Big tech is teaching us about morality.

I do have to agree with them on that one. Fetishizing school uniforms worn by children gives some serious Steven Tyler vibes.

Sexuality is tightly connected to societal taboos. As long as everyone involved is a consenting adult, it's no one else's business. There is no need for, or benefit in, moralizing about people's sexuality.
