A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting the danger and growing ubiquity of generative AI being put to nefarious purposes.

Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, which made for dramatic footage: law enforcement led McCorkle, still in his work uniform, out of the theater in handcuffs.

11 points

Show me multiple (let’s say 3+) small-scale independent academic studies, or 1-2 comprehensive large-scale studies, that support one side or the other and I may be swayed. Otherwise, I think all that is being accomplished is that one guy’s life is getting completely ruined, for now and potentially forever, over some fabrications. As a result he may or may not get help, but I doubt he’ll be better off.

My understanding was that CSAM has its legal status specifically because there are victims who are hurt by these crimes, and possession supports a broader market that facilitates said harm to those victims. It’s not as easy to make a morality argument (especially a good one) for laws that affect everybody when there are no known victims.

-2 points

Are you stupid? Something has to be in the training data for any generation to be possible. This is just a new way to revictimise kids.

2 points

Not necessarily; AI can do wild things by combining attributes.

That said, I do feel very uncomfortable with the amount of defense this guy is getting; he was distributing this to people. If he had just been generating fake images of fake people, using legal training data, in his own house for his own viewing, that would be a different story. The number of people jumping in front of the bullet for this guy when we don’t really know the details is the larger problem.

3 points

So are you suggesting they can get an unaltered facial ID of the kids in the images? Because that would make it regular CSAM with a specific victim (as mentioned), not an AI-generated illustration.

-5 points

No, I am telling you CSAM images can’t be generated by an algorithm that hasn’t been trained on CSAM.

3 points

Dude is gonna get fucked, but someone had to be the test case. Hopefully this gets some legal clarity.

25 points

If this thread (and others like it) has taught me anything, it’s that, facts be damned, people are opinionated either way. Nuance means nothing, and it’s basically impossible to have a proper discussion when it comes to wedge issues or anything that can be used to divide people. Even if every study 100% said AI-generated CSAM always led to a reduction in actual child harm, reduced recidivism, and never needed any real children to be used as training material, the comments would still look pretty much the same. If the studies showed the exact opposite, the comments would also be the same. Welcome to the internet. I hope you brought aspirin.

4 points

I was hoping to comment on this post multiple times today after I initially lost track of it, and now I see you’ve covered about 75% of what I wanted to say. I’ll post the rest elsewhere out of politeness. Thank you.

7 points

My man. Go touch some grass. This place is no good. Not trying to insult you but it’s for your mental health. These Redditors aren’t worth it.

3 points

A lot of the places I’ve been to start conversations have been hostile and painful. If there is one thing that stands out that’s holding Lemmy back, it’s the shitty culture this place can breed.

2 points

Have you ever been to Reddit? This is heaven

2 points

I’m convinced that a lot can be inferred from the type of reactions and the level of hostility one receives when trying to present a calm and nuanced argument on a wedge topic, even if it’s not always enjoyable. At the very least, it also shows others that they may not be interacting with rational actors once one’s opponents go full mask-off.

4 points

Actually. I needed that. Thanks. Enough internet for me today.

14 points

He wasn’t arrested for creating it, but for distribution.

If the dude had just made it and kept it private, he’d be fine.

I’m not defending child porn with this comment.

-13 points
Deleted by creator
6 points

Now I’m imagining you making child porn

9 points

I must admit, the number of comments defending AI images as not being child porn is truly shocking.

In my book, sexual images of children are not okay, AI-generated or otherwise. Pedophiles need help, counseling, and therapy, not images that enable something I think is not acceptable in society.

I truly do believe that AI images should be subject to the same standards as regular images in terms of what content we deem appropriate.

Yes, this can be used to wrongfully prosecute innocent people, but that does not mean we should freely allow AI-CP.

13 points

I generally think if something is not causing harm to others, it shouldn’t be illegal. I don’t know if “generated” CSAM causes harm to others though. I looked it up and it appears the research on whether CSAM consumption increases the likelihood of a person committing child abuse is inconclusive.

1 point

You’re not kidding.

The only possible way I could see a defense is if it were something like “AI CSAM results in a proven reduction of actual CSAM.”

But. The defenses aren’t even that!

They’re literally saying that CSAM is okay. I’m guessing a lot of these same comments would argue that deepfakes are okay as well. Just a completely fucked up perspective.

0 points

Pedophiles need help, counseling and therapy. Not images that enable something I think is not acceptable in society.

I mean, 30-40 years ago you could replace the word pedophile with homosexual and a vast majority of people would agree. I’m not defending pedophilia here, but it’s important to remember these people are born the way they are. Nothing is going to change that; new pedophiles are born every day. They will never go away, the same way you can’t change gay or transgender people. Repressing sexual desire never works; look at the priests in the Catholic Church. A healthy outlet such as AI-generated porn could save a lot of actual children from harm. I think that should be looked into.

-1 points

I would like to know what source you have for claiming that pedophiles are “born the way they are.”

We understand some of the genetic and intrauterine developmental reasons for homosexuality, being trans, etc. That has scientific backing, and our understanding continues to grow and expand.

Lumping child predators in with consenting adults smacks of the evangelical slippery slope argument against all forms of what they consider to be “sexual deviance.” I’m not buying it.

4 points

Look, I get what you are saying and I do agree. However, I don’t think comparing pedophilic relations to LGBTQ struggles is fair. One is a consensual relationship between consenting adults; the other is exploitation and, in all likelihood, a setup for lifelong mental struggles from a young age.

-4 points

The number of people willing to go to bat for this on Lemmy is truly disturbing. What do they think these AI models are trained on?

11 points

Not necessarily trained on CP; it could be trained on images of children (already fucked up, who gave them that permission?) and on pornography.

3 points

The article pointed out that Stable Diffusion was trained using a dataset containing CSAM.

0 points

Can’t speak for others, but I agree that AI-CP should be illegal.

The question is how we define the crime with our current laws. It does seem like we need a new law to address AI images, both for things like AI-CP, revenge porn, and slanderous/misleading photos (the “Communist Harris” and “Trump with Black people” photos).

Where do we draw the line?
How do we regulate it?
Forced watermarks/labels on all tools?
Jail time? Fines?
Forced correction notices? (Doesn’t work for the news!)

This is all a slippery slope, but what I can say is I hope this goes to court. He loses. Appeals. Then it goes all the way up to the federal level so we can have a standard to point to.

The shit’s wrong.
Step one in fixing shit.

8 points

Iirc he was prosecuted under a very broad “obscenity” law, which should terrify everyone.

6 points

Agreed, especially considering it will eventually become indistinguishable.

6 points

If no children were involved in the production of porn, how is it pedophilic? That’s like claiming a picture of water has the same properties as water.

1 point

It’s pedophilic because they are sexual images of children; fake or not doesn’t change that. Drawn images of child pornography are still pedophilic.

The more important question is: is it CSAM? Whether drawn images that represent no real child count depends on the legal jurisdiction. Should drawn and AI-generated child porn be legal or banned? I think the actual answer would require significant research into whether its existence acts as an outlet that prevents pedophiles from harming actual children, or whether it encourages their proclivities and makes them more likely to hurt actual children.

Preventing harm to children should be the goal, but actual research into the effects of simulated child porn vis-à-vis pedophiles harming actual children is as unlikely to happen as any other research into pedophilia.

6 points

However, a picture of water makes me thirsty. But then again, there is no substitute for water.

I am not defending pedos, or defending Florida for doing something like that.

3 points

That might be a you thing. Pictures of water don’t make me thirsty. I get the metaphor you are attempting to make, though.

