16 points

Is it CSAM if it was produced by AI?

28 points

In this case, yes. Material that is visually indistinguishable from a real photo is considered CSAM. We don’t need any new laws about AI to get these assholes; revenge porn laws and federal CSAM statutes will do.

-5 points
Removed by mod
3 points

If they can plant AI CSAM on my computer, they can also plant “real” CSAM on my computer. Your point doesn’t make any sense.

-20 points
Removed by mod
5 points

Nothing about your comment addressed why it should be treated differently if it’s AI-generated but visually indistinguishable.

27 points

It should be considered illegal if it was used to harm or sexually abuse a child, which in this case it was.

Whether it should be classed as CSAM or something separate, I tend to think probably something separate, like a revenge porn type law, that still allows for distinguishing between this and, say, a girl whose uncle groomed and sexually abused her while filming it. While this is awful, it can be (and often seems to be) the product of foolish youth, rather than the offender and everyone involved being very sick, dangerous, and actually violent adult pedophiles victimizing children.

Consider the following:

  1. Underage girl takes a picture of her own genitals. This is unfortunately classified under the unhelpful and harmful term “child porn,” and she can be charged and registered as a sex offender, but it’s not CSAM and -shouldn’t- be considered illegal material or a crime (though it is, because the West has a vile fixation on puritanism which hurts survivors of childhood sexual trauma as well as adults).

  2. Underage girl takes a picture of her genitals and sends it to her boyfriend. Again, it /shouldn’t/ be CSAM (though she may unfortunately be charged similarly); she consented, and we can assume there wasn’t any unreasonable level of coercion. What it is unfortunately bound by are certain notions of puritanism that are very American.

  3. From 2, the boyfriend shares it with other boys. Now it’s potentially CSAM, or at the least revenge porn of a child, as she didn’t consent and it could be used to harm her, but punishment has to be modulated by the fact that the offender is likely a child himself and not fully able to comprehend his actions.

  4. Underage boy cuts out a photo of an underage girl he likes (only her face and head), glues it atop a picture of a naked porn actress, maybe a petite one, and uses it for his own purposes in private. Not something I think should be classed as CSAM.

  5. Underage boy uses AI to do the same as above, but more believably. Again, I think it’s kind of creepy, but if he keeps it to himself and doesn’t show anyone or spread it around, it’s just youthful weirdness, though really he probably shouldn’t have easy access to those tools.

  6. Underage boy uses AI to do the same as 4-5, but this time he spreads it around, defaming the girl. She or her friends find out, people say mean things about her, and she has to go to school with a bunch of people who are looking at and pleasuring themselves to fake but realistic images of her, against her consent, which is violating and makes one feel unsafe. Worse, she’ll probably be bullied for it: mean things said, called the s-word, etc.

Kids are weird and do dumb things, though unfortunately boys, especially in our culture, have a propensity to do things that hurt girls far more than the inverse, to the point that it’s not even really worth talking about girls being creepy or sexually abusive towards peer-aged boys in adolescence and young adulthood. To address this you need to address patriarchy and misogyny on a cultural level: teach boys empathy and respect for girls and women, and frankly do away with all the abusive pornography that’s super prevalent and popular, which encourages and perpetuates abusive actions and mentalities towards women and girls. This will never happen in the US, however, because it’s structurally opposed to being able to do such a thing. It also couldn’t hurt to peel back the stigma and shame around sexuality and nudity in the US, which stems from its reactionary Christian culture, but again I don’t think that will ever happen in the US as it exists, not this century anyway.

Obviously not getting into adults here, as that doesn’t need to be discussed; it’s wrong, plain and simple.

Bottom line, I think companies need to be strongly compelled to quickly remove revenge-porn-type material, which this definitely is, regardless of the age of the victim (though children can’t deal with this kind of thing as well as adults, so the risk of suicide or other self-harm is much higher, and it should be treated as a higher priority). It’s abusive and unacceptable, and companies should fear the credit card companies coming down on them hard and destroying them if they don’t aggressively remove it, ban it, and report those sharing it. It should be driven off the clear web once reported: there should be an image-hash dataset like the one used for CSAM (but separate) for such material, and major services should use it to stop the spread.
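For a rough idea of how that kind of hash matching works, here’s a minimal sketch of a perceptual “average hash” in Python (assuming Pillow is installed). Real systems like PhotoDNA use far more robust, proprietary algorithms, and the blocklist values below are placeholders, not real hashes.

```python
from PIL import Image

def average_hash(img: Image.Image, size: int = 8) -> int:
    """Shrink to a size x size grayscale thumbnail, then set one bit
    per pixel: 1 if the pixel is brighter than the mean, else 0."""
    small = img.convert("L").resize((size, size))
    pixels = list(small.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | int(p > mean)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Count differing bits; a small distance means near-identical
    images, even after re-encoding or minor edits."""
    return bin(a ^ b).count("1")

# Hypothetical blocklist of known-bad hashes (placeholder values only).
BLOCKLIST = {0x00FF00FF00FF00FF}

def is_blocked(path: str, threshold: int = 5) -> bool:
    """Flag an upload whose hash lands within `threshold` bits of any
    blocklisted hash."""
    h = average_hash(Image.open(path))
    return any(hamming_distance(h, bad) <= threshold for bad in BLOCKLIST)
```

The point of perceptual hashing over plain file hashing is that a resized or re-compressed copy still lands within a few bits of the original, so services can match known images without ever storing or sharing the images themselves.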

-11 points

I think it’s best to not defend kiddie porn, unless you have a Republican senator in your pocket.

17 points

Did you reply to the wrong person or do you just have reading comprehension issues?

12 points

I believe that in the US, for all intents and purposes, it is, especially if it was sourced from a minor, because you don’t really have an argument against that one.

4 points
Removed by mod
7 points

I’m not sure where you’re going with that. I would argue that yes, it is, as it’s sexual material of a child, with that child’s face on it, explicitly made for the purpose of defaming her. So I would say it sexually abused a child.

But you could also be taking the stance of “AI trains on adult porn and is merely recreating child porn; no child was actually harmed in the process.” Which, as I’ve said above, I disagree with, especially in this particular circumstance.

Apologies if it’s just my reading comprehension being shit.

2 points

It would be material of, and/or containing, child sexual abuse.

1 point

Is that the definition of CSAM?

-13 points

Is it material that may encourage people to sexually abuse a child?

11 points

It’s actually not clear that viewing such material leads a person to commit in-person abuse.

Providing non-harmful ways to access the content may lead to less abuse, as the content they seek would no longer come from abuse, reducing demand for abusive content.

That being said, this instance isn’t completely fabricated, and its further release is harmful, as it involves a real person and will have an emotional impact.

0 points
Removed by mod
-15 points
Removed by mod
1 point
Removed by mod
