A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting the danger and ubiquity of generative AI being used for nefarious purposes.
Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, which made for dramatic footage as officers led the still-uniformed McCorkle out of the theater in handcuffs.
This creates a significant legal issue: an AI-generated image has no age, and there is no one who could have given consent.
The difference in appearance between age 16 and 18 is minimal, but the legal difference is immense, and it rests entirely on a concept (actual age) that cannot apply to a generated image.
How do you define what depicts a fictional child, especially without sweeping in real adults? I’ve met people who believe that preferring a shaved pubic area is pedophilia, even though the vast majority of adult women shave. On the flip side, teenagers from the 70s and 80s would be mistaken for 40+ today.
Even the extremes aren’t clear. Adult star “Little Lupe”, who was 18+ in every single appearance, lacked most secondary sex characteristics. Experts testified in court that she could not possibly be an adult. Except she was, and there’s full documentation to prove it. Would AI trained exclusively on her work be producing CSAM?
To paraphrase someone smarter than me, “I’ll know it when I see it.”
But naturally I don’t want to see it. One of the things I miss least about reddit is the constant image posts of anime characters, who may be whatever age they claim but are clearly drawn as very young girls with big tiddies bolted on. It’s gross, but it’s also a problem that’s more widespread and nebulous than most people are willing to admit.
“I’ll know it when I see it.”
I can’t think of anything scarier than that when dealing with the legality of anything.
I’m nearly 40 and still regularly get carded while other people out with me do not, so it’s not just “we card everyone”. People are bad at judging age.
https://en.m.wikipedia.org/wiki/I_know_it_when_I_see_it
They really downplayed the criticism of the phrase in the article; it’s actually criticised quite often for being so subjective.
Sometimes something can’t have a perfect definition. What’s the difference between a gulf, a bay, and a channel? Where does the shoreline become a beach? When does an arid prairie become a desert? How big does a town have to grow before it becomes a city? At what point does a cult become a religion? When does a murder become premeditated vs a crime of passion? When does a person become too drunk to give active consent? Human behavior is a million shades of gray, just like everything else we do, and the things that don’t fit our clear definitions are where the law needs to be subjective.
When you’re just trying to guess someone’s age (we’ll assume something completely family-friendly and above board), think back to high school. How old did you and your peers look? Now go take a look at high schoolers today. They probably seem a lot younger than you did. The longer it’s been (i.e., the older you are), the younger they look. Which means “when I see it” depends entirely on the age of the viewer.
This isn’t even just about perception and memory: modern style is based on and heavily influenced by youth, and it keeps moving further in that direction. This is why actors in their 30s, with carefully managed hair, skin, makeup, and wardrobe, have been able to convincingly portray high schoolers. So it’s not just you: teens really are looking younger each year. But they’re still the same age.
Wtf. Style is what makes kids look young or old to us, because we have been heavily marketed to and follow trends. That’s why when the mullet/porn-stache style came back, those Dahmer kids looked like they were in their 40s.
You’re getting older each year so teens look younger to you.
Name even one actor in their thirties who convincingly played a high schooler. Literally who?
I don’t see how children were abused in this case? It’s just AI imagery.
It’s the same as saying that people get killed when you play first person shooter games.
Or that you commit crimes when you play GTA.
Then every artist creating loli porn would also have to be jailed for child pornography.
But this is the US… and it’s kind of a double standard if you’re not arrested for drawing it but are for generating it.
Not a great comparison, because unlike with violent games or movies, you can’t say there is no danger to anyone in allowing these images to be created or distributed. If they are indistinguishable from the real thing, it becomes impossible to identify actual human victims.
There’s also a strong argument that the availability of imagery like this only encourages behavioral escalation in people who suffer from the affliction of being a sick fucking pervert pedophile. It’s not methadone for them, as some would argue. It’s just fueling their addiction, not replacing it.
The difference is intent. When you’re playing a FPS, the intent is to play a game. When you play GTA the intent is to play a game.
The intent with AI generated CSAM is to watch kids being abused.
Punishing people for intending to do something is punishing them for thought crimes. That is not the world I want to live in.
This guy did do something - he either created or accessed AI generated CSAM.
It’s just AI imagery.
Fantasising about sexual contact with children indicates that this person might groom children for real, because they have a sexual interest in doing so. As someone who was sexually assaulted as a child, it’s really not something that needs to happen.
indicates that this person might groom children for real
But unless they have already done it, that’s not a crime. People are prosecuted for actions they commit, not their thoughts.
I agree, this line of thinking quickly spirals into Minority Report territory.
Seems like fantasizing about shooting people or carjacking indicates that a person might do that activity for real too, then. There are a lot of carjackings nowadays, and you know GTA is real popular. Hmmm. /s But seriously, I’m not sure your first statement has merit, especially when you look at where to draw the line: anime, manga, oil paintings, books, thoughts in one’s head.
If you’re asking whether anime, manga, oil paintings, and books glorifying the sexualization of children should also be banned, well, yes.
This is not comparable to glorifying violence, because real children are victimized in order to create some of these images, and the fact that it’s impossible to tell real from fake makes it even more imperative that all such imagery be banned: the existence of fakes makes it that much harder to identify real victims.
It’s like you know there’s an armed bomb on a street, but somebody else filled the street with fake bombs, because they get off on it or whatever. Maybe you’d say making fake bombs shouldn’t be illegal because they can’t harm anyone. But now suddenly they have made the job of law enforcement exponentially more difficult.
If you want to keep people who fantasise about sexually exploiting children around your family, be my guest. My family tried that, and I was raped. I didn’t like that, and I have drawn my own conclusions.
Well, the image generator had to be trained on something first in order to spit out child porn. While it may be that the training set was solely drawn/rendered images, we don’t know that, and even if the output were in that style, it might very well be photorealistic images generated from real child porn and run through a filter.
An AI that is trained on children and nude adults can infer what a nude child looks like without ever being trained specifically with those images.
Your argument is hypothetical. Real-world AI was trained on images of abused children.
Wild corn dogs are an outright plague where I live. When I was younger, me and my buddies would lay snares to catch to corn dogs. When we caught one, we’d roast it over a fire to make popcorn. Corn dog cutlets served with popcorn from the same corn dog is popular meal, especially among the less fortunate. Even though some of the affluent consider it the equivalent to eating rat meat. When me pa got me first rifle when I turned 14, I spent a few days just shooting corn dogs.
It didn’t generate what we expect and know a corn dog is.
Hence it missed, because it doesn’t know what a “corn dog” is.
You have proven the point that it couldn’t generate CSAM without some being present in the training data.
How was the model trained? Probably using existing CSAM images. Those children are victims. Making derivative images of “imaginary” children doesn’t negate the exploitation of children all the way down.
So no, you are making a false equivalence with your video game metaphors.
A generative AI model doesn’t require the exact thing it creates in its datasets. It most likely just combined regular nudity with a picture of a child.
In that case, the images of children were still used without their permission to create the child porn in question.
Can you or anyone verify that the model was trained on CSAM?
Besides, an LLM doesn’t need explicit content to derive from in order to create a naked child.
You’re defending the generation of CSAM pretty hard here, with some vague “but no child we know of was involved” as a defense.
But the AI companies insist the outputs of these models aren’t derivative works in any other circumstances!
Do we know that AI child porn is bad? I could believe it would get them in the mood for the real thing and make them do it more, and I could believe it would make them go “ok, itch scratched”, and tank the demand for the real stuff.
Depending on which way it goes, it could be massively helpful for protecting kids. I just don’t have a sense for what the effect would be, and I’ve never seen any experts weigh in.
Do we know that AI child porn is bad? I could believe it would get them in the mood for the real thing and make them do it more, and I could believe it would make them go “ok, itch scratched”, and tank the demand for the real stuff.
From bits/articles I’ve seen here and there over the years about other things that are kind of in the same category (porn comics with child characters in them, child-shaped sex dolls), the latter seems to be more the case.
I’m reminded of when people were arguing that when Internet porn became widespread, the incidence of rape would go through the roof. And then literally the opposite happened. So…that pushes me toward hypothesizing that the latter is more likely to be the case, as well.
In Australia cartoon child porn is enforced in the same way as actual child porn. Not that it answers your question but it’s interesting.
I’d imagine the answer to your question is “it depends”: some people who would have acted on their urges may get their jollies from AI child porn instead, while others who had never considered being pedophiles might find AI child porn (assuming it were legal) and realise it’s something they were into.
I guess it may lower the production of real child porn which feels like a good thing. I’d hazard a guess that there are way more child porn viewers than child abusers.
In Australia a 30 year old woman cannot be in the porn industry if she has small breasts. That, and the cartoon ban both seem like overcompensating.
Nothing says “we’re protecting children” like regulating what adult women can do with their bodies.
Conservatives are morons, every time.
I’d like to know what psychologists think about it. My assumption is the former: it escalates their fantasizing about it and makes them more likely to attack a child.
There seems to be no way to conduct that experiment ethically, though.
There’s like a lot of layers to it.
- For some, it might actually work in the opposite direction, especially if paired with the wrong kind of community around it. I used to moderate anime communities, and the number of loli fans wanting to lower the age of consent to 12 or even lower was way too high, while they called anyone opposed to loli the “real predators”, because they liked their middle-school-tier arguments (which just further polarized the fandom when the culture wars started).
- Even worse, the more realistic depictions might actually work against that goal, while with (most) loli stuff, at least it’s obvious it’s drawn.
- An often overlooked issue is data laundering: just call your real CP AI-generated, or add some GAI artifacts to your collection. Hungary bans overly realistic drawings and paintings of that kind, because people did exactly that with traditional means, by creating tracings as realistic as possible (calling CP “artistic nudes” didn’t fly here, at least).
In Norway, imagining or describing acts with a 16-year-old is CP, but having sex with a 16-year-old is perfectly legal.
You’re missing the point. They don’t care what’s more or less effective for helping kids. They want to punish people who are different. In this case nobody is really going to step up to defend the guy for obvious reasons. But the motivating concept is the same for conservatives.
Depending on which way it goes, it could be massively helpful for protecting kids
Weeeelll, only until the AI model needs more training material…
Could this be considered a harm reduction strategy?
Not that I think CSAM is good in any way, but if it saves a child would it be worthwhile? Like if these pedos were to use AI images instead of actual CSAM would that be any better?
I’ve read that CSAM sites on the dark web number into the hundreds of thousands. I just wonder if it would be a less harmful thing since it’s such a problem.
Many years ago (about 25) I read an article in a newspaper (idk the name, but it may have been The Computer Paper, which is archived online someplace). The article noted that a study had been commissioned to show that CP access increases child abuse. The study seemed to show the opposite.
Here’s the problem with even AI-generated CP: it might lower abuse in the beginning, but with increased access it would “normalise” the perception of such conduct. This would likely increase abuse over time, even involving persons who may not have been so inclined otherwise.
This is all very complex. A solution isn’t simple. Shunning the issue won’t help, though, and that seems to be the currently most popular way of dealing with it.
Actual pedophiles (a lot of CSA is abuse of power, not pedophilia, though to be clear, fuck abusers either way) have a high rate of suicidal ideation, because they think it’s as fucked up as everyone else does. Of course we can’t just say “sure, AI material is legal now”, but I could imagine a regulated system accessed via doctors, akin to how controlled substances work.
People take this firm “kill em all” stance, but these people just feel the way they do, the same as I do towards women or a gay man does towards men. It just is what it is: we all generally agree being gay isn’t a choice, and this is no different. As long as they don’t act on it, I think we should be sympathetic and open to helping them live a less tortured life.
I’m not 100% saying this is how we do it, but we should be open to exploring the issue instead of full stop demonization.
Dan Savage coined the term “gold star pedophile” in a column years ago, referring to people who acknowledge their attraction to children but never act on it by harming a child or accessing CSAM. I do feel bad for these people because there are no resources to help them. The only way they can access actual therapeutic resources for their condition is by offending and going to jail. If the pedophile goes to a therapist and confesses attraction to children, therapists are mandated reporters and will assume they’re going to act on it. An article I read a few years back interviewed members of an online community of non-offending pedophiles who essentially made their own support group since no one else will help them, and nearly all research on them is from a forensic (criminal) context.
Don’t get me wrong - I think offenders need to be punished for what they do. I unfortunately have a former best friend who has offended. He’s no longer in my life and never will be again. But I think we could prevent offenders from reaching that point and hurting someone if we did more research and found ways to stop them before it happened.
I agree for the most part, particularly that we should be open minded.
Obviously we don’t have much reliable data, which I think is critically important.
The only thing I would add is that I’m not sure treating a desire for CSAM would be the same as treating substance abuse. “Weaning an addict off CSAM” seems like a strange proposition to me.
“Normalized” violent media doesn’t seem to have increased the prevalence of real world violence.
You would think so, but you’re basically making a patchwork version of the actual illicit media, so it’s a dark, dark gray area for sure.
Generative AI is basically just really overpowered text/image prediction. It fills in the words or pixels that make the most sense based on the data it has been fed, so to get AI-generated CSAM… it had to have been fed some amount of CSAM at some point, or it had to be heavily manipulated into generating the images in question.
I guess my question is does access to regular porn make people not want to have real sex with another person? Does it ‘scratch the itch’ so to speak? Could they go the rest of their life with only porn to satisfy them?
It depends on the person. I feel like most people would be unsatisfied with only porn, but that’s just anecdotal.
I honestly think AI-generated CSAM isn’t something the world needs produced. It’s not contributing to society in any meaningful way, and pedophiles who don’t offend or hurt children need therapy, while the ones who do need jail time (and therapy, but I’m in the US, so that’s a whole other thing). They don’t “need” porn.
My own personal take is that giving pedophiles csam that’s AI generated is like showing alcohol ads to alcoholics. Or going to the strip club if you’re a sex addict. It’s probably not going to lead to good outcomes.
By the same metric, I wonder why we don’t let convicted murderers and psychopaths work at slaughterhouses.
On the other hand, are people who work at slaughterhouses more likely to be murderers and psychopaths?
Lolicon fans in absolute shambles.