A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting both the danger and the growing ubiquity of generative AI put to nefarious use.

Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, producing dramatic footage of law enforcement leading McCorkle, still in his work uniform, out of the theater in handcuffs.

85 points

This creates a significant legal issue: AI-generated images have no age, nor is there anyone to give or withhold consent.

The difference in appearance between ages 16 and 18 is minimal, but the legal difference is immense, and it rests entirely on a concept that cannot apply to a generated image.

How do you define what depicts a fictional child, especially without sweeping in real adults? I’ve met people who believe that preferring a shaved pubic area is pedophilic, even though the vast majority of adult women shave. On the flip side, teenagers from the ’70s and ’80s would be mistaken for 40-year-olds today.

Even the extremes aren’t clear. Adult star “Little Lupe”, who was 18+ in every single appearance, lacked most secondary sex characteristics; experts testified in court that she could not possibly be an adult. Except she was, and there’s full documentation to prove it. Would an AI trained exclusively on her work be producing CSAM?

-23 points

To paraphrase someone smarter than me, “I’ll know it when I see it.”

But naturally I don’t want to see it. One of the things I miss least about Reddit is the constant image posts of anime characters who may be whatever age the caption claims but are clearly drawn as very young girls with big tiddies bolted on. It’s gross, but it’s also a problem that’s more widespread and nebulous than most people are willing to admit.

101 points

“I’ll know it when I see it.”

I can’t think of anything scarier than that when dealing with the legality of anything.

27 points

I’m nearly 40 and still regularly get carded while other people out with me do not, so it’s not just “we card everyone.” People are bad at judging age.

18 points

https://en.m.wikipedia.org/wiki/I_know_it_when_I_see_it

They really downplayed the criticism of the phrase in the article; it’s actually criticised quite often for being so subjective.

3 points

Sometimes something can’t have a perfect definition. What’s the difference between a gulf, a bay, and a channel? Where does the shoreline become a beach? When does an arid prairie become a desert? How big does a town have to grow before it becomes a city? At what point does a cult become a religion? When does a murder become premeditated vs. a crime of passion? When does a person become too drunk to give active consent? Human behavior is a million shades of gray, just like everything else we do, and the things that don’t fit into our clear definitions are where the law needs to be subjective.

17 points

When trying to guess someone’s age (we’ll assume a completely family-friendly, above-board context), think back to high school. How old did you and your peers look? Now go take a look at high schoolers today. They probably seem a lot younger than you did. The longer it’s been (i.e. the older you are), the younger they look. Which means “when I see it” depends entirely on the age of the viewer.

This isn’t even just about perception and memory: modern style is heavily influenced by youth, and it keeps moving further in that direction. That’s why actors in their 30s, with carefully managed hair, skin, makeup, and wardrobe, have been able to convincingly portray high schoolers. So it’s not just you: teens really are looking younger each year. But they’re still the same age.

3 points

Wtf. Style is what makes kids look young or old to us, because we’ve been heavily marketed to and follow trends. That’s why, when the mullet/pornstache style came back, those Dahmer kids looked like they were in their 40s.

You’re getting older each year, so teens look younger to you.

Name even one actor in their thirties who convincingly played a high schooler. Literally who?

73 points

I don’t see how children were abused in this case? It’s just AI imagery.

It’s the same as saying that people get killed when you play first person shooter games.

Or that you commit crimes when you play GTA.

33 points

Then every artist creating loli porn would also have to be jailed for child pornography.

19 points

But this is the US… and it’s kind of a double standard if you’re not arrested for drawing it but are for generating it.

-8 points

Not a great comparison, because unlike with violent games or movies, you can’t say there is no danger to anyone in allowing these images to be created or distributed. If they are indistinguishable from the real thing, it becomes impossible to identify actual human victims.

There’s also a strong argument that the availability of imagery like this only encourages behavioral escalation in people who suffer from the affliction of being a sick fucking pervert pedophile. It’s not methadone for them, as some would argue. It’s just fueling their addiction, not replacing it.

-21 points

The difference is intent. When you’re playing an FPS, the intent is to play a game. When you play GTA, the intent is to play a game.

The intent with AI generated CSAM is to watch kids being abused.

29 points

Who’s to say there aren’t people playing games to watch people die?

-6 points

There may well be the odd weirdo playing Call of Duty to watch people die.

But everyone who watches CSAM is watching it to watch kids being abused.

10 points

When you’re playing an FPS, the intent is to watch people being murdered.

How is this argument any different?

3 points

Punishing people for intending to do something is punishing them for thought crimes. That is not the world I want to live in.

-1 points

This guy did do something: he either created or accessed AI-generated CSAM.

-35 points

It’s just AI imagery.

Fantasising about sexual contact with children indicates that this person might groom children for real, because they have a sexual interest in doing so. As someone who was sexually assaulted as a child, it’s really not something that needs to happen.

81 points

indicates that this person might groom children for real

But unless they have already done it, that’s not a crime. People are prosecuted for actions they commit, not their thoughts.

65 points

I agree, this line of thinking quickly spirals into Minority Report territory.

21 points

Seems like fantasizing about shooting people or carjacking would then indicate that a person might do that activity for real too. There are a lot of carjackings nowadays, and you know GTA is real popular. Hmmm. /s But seriously, I’m not sure your first statement has merit, especially when you look at where to draw the line: anime, manga, oil paintings, books, thoughts in one’s head.

-10 points

If you’re asking whether anime, manga, oil paintings, and books glorifying the sexualization of children should also be banned, well, yes.

This is not comparable to glorifying violence, because real children are victimized in order to create some of these images, and it’s impossible to tell which ones. That makes it even more imperative that all such imagery be banned: the existence of fakes makes it even harder to identify real victims.

It’s like you know there’s an armed bomb on a street, but somebody else filled the street with fake bombs, because they get off on it or whatever. Maybe you’d say making fake bombs shouldn’t be illegal because they can’t harm anyone. But now suddenly they have made the job of law enforcement exponentially more difficult.

-15 points

If you want to keep people who fantasise about sexually exploiting children around your family, be my guest. My family tried that, and I was raped. I didn’t like that, and I have drawn my own conclusions.

-37 points

Well, the image generator had to be trained on something first in order to spit out child porn. While it may be that the training set was solely drawn/rendered images, we don’t know that, and even if the output were in that style, it might very well be photorealistic images generated from real child porn and run through a filter.

47 points

An AI that is trained on children and nude adults can infer what a nude child looks like without ever being trained specifically with those images.
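A toy sketch of that kind of compositional generalization, using deliberately neutral concepts (the vectors and names below are invented for illustration and don’t reflect any real model):

```python
import numpy as np

# A model that has learned "color" and "shape" as independent directions
# in its representation space can represent combinations it never saw.
red    = np.array([1.0, 0.0, 0.0, 0.0])
blue   = np.array([0.0, 1.0, 0.0, 0.0])
circle = np.array([0.0, 0.0, 1.0, 0.0])
square = np.array([0.0, 0.0, 0.0, 1.0])

seen = {
    "red circle":  red + circle,   # present in the training data
    "blue square": blue + square,  # present in the training data
}
novel = red + square  # "red square": representable, yet never observed

for name, vec in seen.items():
    print(f"'red square' vs '{name}': distance {np.linalg.norm(novel - vec):.2f}")
```

The same composition-over-learned-concepts behavior is why a prompt like “an astronaut riding a horse” can work even though no such photo exists in the training set.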

-13 points

Your argument is hypothetical. Real-world AI was trained on images of abused children.

https://cyber.fsi.stanford.edu/news/investigation-finds-ai-image-generation-models-trained-child-abuse

46 points

How many corn dogs do you think were in the training data?

6 points

Wild corn dogs are an outright plague where I live. When I was younger, me and my buddies would lay snares to catch corn dogs. When we caught one, we’d roast it over a fire to make popcorn. Corn dog cutlets served with popcorn from the same corn dog is a popular meal, especially among the less fortunate, even though some of the affluent consider it the equivalent of eating rat meat. When me pa got me first rifle when I turned 14, I spent a few days just shooting corn dogs.

-1 points

It didn’t generate what we expect and know a corn dog is.

Hence it missed: it doesn’t know what a “corn dog” is.

You have proven the point that it couldn’t generate CSAM without some being present in the training data.

33 points

Just say you don’t get how it works.

16 points

we don’t know that

might

Unless you’re operating under “guilty until proven innocent”, those are not reasons to accuse someone.

-38 points

How was the model trained? Probably using existing CSAM images. Those children are victims. Making derivative images of “imaginary” children doesn’t negate the exploitation of real children it’s built on, all the way down.

So no, you are making a false equivalence with your video game metaphors.

56 points

A generative AI model doesn’t require the exact thing it creates in its datasets. It most likely just combined regular nudity with a picture of a child.

-13 points

In that case, the images of children were still used without their permission to create the child porn in question.

27 points

Can you or anyone verify that the model was trained on CSAM?

Besides, an LLM doesn’t need explicit content to derive from in order to create a naked child.

6 points

No they are not.

-24 points

You’re defending the generation of CSAM pretty hard here, with a vague “but no child we know of was involved” as the defense.

11 points

While I wouldn’t put it past Meta & Co. to explicitly seek out CSAM to train their models on, I don’t think that is how this stuff works.

2 points

Wrong.

-5 points

But the AI companies insist the outputs of these models aren’t derivative works in any other circumstances!

6 points

Cuz they’re not

61 points

Do we know that AI child porn is bad? I could believe it would get them in the mood for the real thing and make them do it more, and I could believe it would make them go “ok, itch scratched”, and tank the demand for the real stuff.

Depending on which way it goes, it could be massively helpful for protecting kids. I just don’t have a sense for what the effect would be, and I’ve never seen any experts weigh in.

33 points

Do we know that AI child porn is bad? I could believe it would get them in the mood for the real thing and make them do it more, and I could believe it would make them go “ok, itch scratched”, and tank the demand for the real stuff.

From bits/articles I’ve seen here and there over the years about other things that are kind of in the same category (porn comics with child characters in them, child-shaped sex dolls), the latter seems to be more the case.

I’m reminded of when people were arguing that when Internet porn became widespread, the incidence of rape would go through the roof. And then literally the opposite happened. So…that pushes me toward hypothesizing that the latter is more likely to be the case, as well.

22 points

In Australia, cartoon child porn is treated the same way as actual child porn. Not that it answers your question, but it’s interesting.

I’d imagine for your question it’s “it depends”: some people who would have acted on their urges may get their jollies from AI child porn, while others who had never considered themselves pedophiles might find the AI material (assuming it were legal) and realise it’s something they were into.

I guess it may lower the production of real child porn, which feels like a good thing. I’d hazard a guess that there are way more child porn viewers than child abusers.

10 points

In Australia, a 30-year-old woman cannot be in the porn industry if she has small breasts. That and the cartoon ban both seem like overcompensating.

12 points

Nothing says “we’re protecting children” like regulating what adult women can do with their bodies.

Conservatives are morons, every time.

16 points

I seem to remember Sweden did a study on this, but I don’t really want to google around to find it for you. Good luck!

13 points

I’d like to know what psychologists think about it. My assumption is the former: it escalates their fantasizing about it and makes them more likely to attack a child.

There seems to be no way to conduct that experiment ethically, though.

13 points

Real question: “do we care if AI child porn is bad?” Based on most countries’ laws, no.

12 points

There are a lot of layers to it.

  • For some, it might actually work in the opposite direction, especially if paired with the wrong kind of community around it. I used to moderate anime communities, and the number of loli fans wanting to lower the age of consent to 12 or even lower was way too high, yet they called anyone opposed to loli the “real predators”, because they liked their middle-school-tier arguments (which just further polarized the fandom when the culture wars started).
  • Even worse, the more realistic depictions might actually work against that goal, while with (most) loli stuff it’s at least obvious that it’s drawn.
  • An often overlooked issue is data laundering: just call your real CP AI-generated, or add some GenAI artifacts to your collection. Hungary bans overly realistic drawings and paintings of that kind because people did exactly this with traditional means, creating tracings as realistic as possible (calling CP “artistic nudes” didn’t fly here, at least).
8 points

In Canada, even animated CP is treated as the real deal.

12 points

In Norway, imagining or describing acts with a 16-year-old is CP, but having sex with a 16-year-old is perfectly legal.

3 points

Lol damn it Norway

5 points

You’re missing the point. They don’t care what’s more or less effective for helping kids. They want to punish people who are different. In this case nobody is really going to step up to defend the guy for obvious reasons. But the motivating concept is the same for conservatives.

4 points

There definitely is opportunity in controlled treatment, but outside of that, I believe there are too many unknowns.

4 points

Wikipedia seems to suggest research is inconclusive whether consuming CSAM increases the likelihood of committing abuse.

-19 points

Depending on which way it goes, it could be massively helpful for protecting kids

Weeeelll, only until the AI model needs more training material…

20 points

That’s not how it works. The “generative” in “generative AI” is there for a reason.

5 points

You need more training material to train a new AI. Once the AI exists, it can produce as many pictures as you want, and you can get good results even with models that run locally on a regular computer.
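That training/generation split is easy to see even in a toy generator. A minimal sketch, with a character-level Markov chain standing in for a vastly larger model: the data-hungry step happens once, and sampling afterwards needs no further training material.

```python
import random
from collections import Counter, defaultdict

corpus = "the cat sat on the mat and the dog sat on the log"

# Training: the one-time, data-dependent step.
model = defaultdict(Counter)
for a, b in zip(corpus, corpus[1:]):
    model[a][b] += 1

def sample(start="t", length=40):
    """Generation: needs no new data; run it as often as you like."""
    out = start
    for _ in range(length):
        nxt = model[out[-1]]
        if not nxt:
            break
        chars, weights = zip(*nxt.items())
        out += random.choices(chars, weights=weights)[0]
    return out

for _ in range(3):  # three fresh outputs from the same trained model
    print(sample())
```

The corpus here is a single sentence, but the structure is the point: once `model` exists, generating more output costs nothing but compute.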

-2 points

I’m not sure if that is how it would work? But this is exactly the kind of thinking we need. Effects: intended plus unintended equals ???

56 points

Could this be considered a harm reduction strategy?

Not that I think CSAM is good in any way, but if it saved a child, would it be worthwhile? Like, if these pedos were to use AI images instead of actual CSAM, would that be any better?

I’ve read that CSAM sites on the dark web number into the hundreds of thousands. I just wonder if it would be a less harmful thing since it’s such a problem.

38 points

Many years ago (about 25) I read an article in a newspaper (idk the name, but it may have been The Computer Paper, which is archived online someplace). The article noted that a study had been commissioned to show that CP access increases child abuse. The study seemed to show the opposite.

Here’s the problem with even AI-generated CP: it might lower abuse in the beginning, but with increased access it would ‘normalise’ the perception of such conduct. That would likely increase abuse over time, even involving persons who might not have been so inclined otherwise.

This is all very complex, and a solution isn’t simple. Shunning the issue won’t help, though, and that seems to be the currently most popular way of dealing with it.

25 points

Actual pedophiles (a lot of CSA is abuse of power, not pedophilia, though to be clear, fuck abusers either way) have a high rate of suicidal ideation, because they think it’s as fucked up as everyone else does. Of course we can’t just say “sure, AI material is legal now”, but I could imagine a regulated system accessed via doctors, akin to how controlled substances work.

People take this firm “kill em all” stance, but these people just feel the way they do, the same as I feel toward women or a gay man feels toward men. It just is what it is: we all generally agree being gay isn’t a choice, and this is no different. As long as they don’t act on it, I think we should be sympathetic and open to helping them live a less tortured life.

I’m not 100% saying this is how we do it, but we should be open to exploring the issue instead of full stop demonization.

11 points

Dan Savage coined the term “gold star pedophile” in a column years ago, referring to people who acknowledge their attraction to children but never act on it by harming a child or accessing CSAM. I do feel bad for these people because there are no resources to help them. The only way they can access actual therapeutic resources for their condition is by offending and going to jail. If the pedophile goes to a therapist and confesses attraction to children, therapists are mandated reporters and will assume they’re going to act on it. An article I read a few years back interviewed members of an online community of non-offending pedophiles who essentially made their own support group since no one else will help them, and nearly all research on them is from a forensic (criminal) context.

There’s a pretty good article by James Cantor talking about dealing with pedophiles in a therapeutic context here.

Don’t get me wrong - I think offenders need to be punished for what they do. I unfortunately have a former best friend who has offended. He’s no longer in my life and never will be again. But I think we could prevent offenders from reaching that point and hurting someone if we did more research and found ways to stop them before it happened.

6 points

I agree for the most part, particularly that we should be open-minded.

Obviously we don’t have much reliable data, which I think is critically important.

The only thing I would add is that I’m not sure treating a desire for CSAM would be the same as treating substance abuse. “Weaning an addict off CSAM” seems like a strange proposition to me.

18 points

“Normalized” violent media doesn’t seem to have increased the prevalence of real world violence.

4 points

I actually think video games reduce crime in general. Bad kids are now indoors getting their thrills.

2 points

That makes sense. I don’t know what a better answer is, just thinking out loud.

13 points

You would think so, but you’re basically making a patchwork version of the actual illicit media, so it’s a dark, dark gray area for sure.

2 points

Hmm ok. I don’t know much about AI.

4 points

Generative AI is basically just really overpowered text/image prediction. It fills in the words or pixels that make the most sense based on the data it has been fed, so to get AI-generated CSAM… it had to have been fed some amount of CSAM at some point, or it had to be heavily manipulated into generating the images in question.
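That “fill in what fits” idea can be sketched in a few lines. Below is a deliberately tiny stand-in for an image model (everything here is invented for illustration, not any real system): it learns, from example 3x3 “images”, which center pixel is most plausible given its four neighbors, then uses that learned statistic to fill a masked pixel.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy training set: 3x3 binary "images" whose center pixel follows a
# simple rule (it matches the majority of its 4 neighbors).
train = []
for _ in range(200):
    img = rng.integers(0, 2, size=(3, 3))
    img[1, 1] = int(img[0, 1] + img[1, 0] + img[1, 2] + img[2, 1] >= 2)
    train.append(img)

# "Training": count how often each neighbor configuration co-occurs
# with each center value.
counts = {}
for img in train:
    key = (img[0, 1], img[1, 0], img[1, 2], img[2, 1])
    counts.setdefault(key, [0, 0])[img[1, 1]] += 1

def fill_center(img):
    """Predict a masked center pixel: pick the value most often seen
    with this neighbor configuration in the training data."""
    key = (img[0, 1], img[1, 0], img[1, 2], img[2, 1])
    zeros, ones = counts.get(key, [1, 1])
    return int(ones >= zeros)

hole = np.array([[0, 1, 0],
                 [1, -1, 1],   # -1 marks the missing pixel
                 [0, 1, 0]])
print("predicted center pixel:", fill_center(hole))
```

Scale that up by many orders of magnitude, with a neural network in place of the frequency table, and you get roughly the prediction machinery described above; whether a given output required specific material in the training data is exactly the question argued over in this thread.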

7 points

I guess my question is, does access to regular porn make people not want to have real sex with another person? Does it ‘scratch the itch’, so to speak? Could they go the rest of their life with only porn to satisfy them?

It depends on the person. I feel like most people would be unsatisfied with only porn, but that’s just anecdotal.

I honestly don’t think AI-generated CSAM is something the world needs to be producing. It’s not contributing to society in any meaningful way, and pedophiles who don’t offend or hurt children need therapy, while the ones who do need jail time (and therapy, but I’m in the US, so that’s a whole other thing). They don’t ‘need’ porn.

My own personal take is that giving pedophiles AI-generated CSAM is like showing alcohol ads to alcoholics, or going to the strip club if you’re a sex addict. It’s probably not going to lead to good outcomes.

1 point

You definitely have a good point. I was just thinking about how to reduce harm, but obviously I don’t want it to be legal.

0 points

“Because fuck that” is not a great argument.

0 points

By the same metric, I wonder why we don’t let convicted murderers and psychopaths work at slaughterhouses.

4 points

On the other hand, are people who work at slaughterhouses more likely to be murderers and psychopaths?

2 points

Perhaps, but I said convicted.

39 points

Lolicon fans in absolute shambles.

6 points

CANNED IN BANADA

