A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting the danger and ubiquity of generative AI being used for nefarious reasons.
Phillip Michael McCorkle was arrested last week while he was working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, which made for dramatic footage as law enforcement led McCorkle, still in his work uniform, out of the theater in handcuffs.
I don’t see how children were abused in this case. It’s just AI imagery.
It’s the same as saying that people get killed when you play first person shooter games.
Or that you commit crimes when you play GTA.
It’s just AI imagery.
Fantasising about sexual contact with children indicates that this person might groom children for real, because they have a sexual interest in doing so. As someone who was sexually assaulted as a child, it’s really not something that needs to happen.
indicates that this person might groom children for real
But unless they have already done it, that’s not a crime. People are prosecuted for actions they commit, not their thoughts.
I agree, this line of thinking quickly spirals into Minority Report territory.
Seems like fantasizing about shooting people or carjacking or such indicates that person might do that activity for real too. There are a lot of carjackings nowadays, and you know GTA is real popular. mmmm. /s But seriously, I’m not sure your first statement has merit, especially when you look at where to draw the line: anime, manga, oil paintings, books, thoughts in one’s head.
If you’re asking whether anime, manga, oil paintings, and books glorifying the sexualization of children should also be banned, well, yes.
This is not comparable to glorifying violence, because real children are victimized in order to create some of these images. And the fact that it’s impossible to tell which are which makes it even more imperative that all such imagery be banned, because the existence of fakes makes it even harder to identify real victims.
It’s like you know there’s an armed bomb on a street, but somebody else filled the street with fake bombs, because they get off on it or whatever. Maybe you’d say making fake bombs shouldn’t be illegal because they can’t harm anyone. But now suddenly they have made the job of law enforcement exponentially more difficult.
If you want to keep people who fantasise about sexually exploiting children around your family, be my guest. My family tried that, and I was raped. I didn’t like that, and I have drawn my own conclusions.
How was the model trained? Probably using existing CSAM images. Those children are victims. Making derivative images of “imaginary” children doesn’t negate its exploitation of children all the way down.
So no, you are making false equivalence with your video game metaphors.
But the AI companies insist the outputs of these models aren’t derivative works in any other circumstances!
Can you or anyone verify that the model was trained on CSAM?
Besides, a generative model doesn’t need explicit content to derive from in order to create a naked child.
You’re defending the generation of CSAM pretty hard here, with some vague “but no child we know of was involved” as a defense.
A generative AI model doesn’t require the exact thing it creates in its datasets. It most likely just combined regular nudity with a picture of a child.
In that case, the images of children were still used without their permission to create the child porn in question.
Well, the image generator had to be trained on something first in order to spit out child porn. While it may be that the training set was solely drawn/rendered images, we don’t know that, and even if the output were in that style, it might very well be photorealistic images generated from real child porn and run through a filter.
An AI that is trained on children and nude adults can infer what a nude child looks like without ever being trained specifically with those images.
Your argument is hypothetical. Real-world AI was trained on images of abused children.
It didn’t generate what we expect and know a corn dog is.
Hence it missed, because it doesn’t know what a “corn dog” is.
You have proven the point that it couldn’t generate CSAM without some being present in the training data.
Wild corn dogs are an outright plague where I live. When I was younger, me and my buddies would lay snares to catch corn dogs. When we caught one, we’d roast it over a fire to make popcorn. Corn dog cutlets served with popcorn from the same corn dog is a popular meal, especially among the less fortunate. Even though some of the affluent consider it the equivalent to eating rat meat. When me pa got me first rifle when I turned 14, I spent a few days just shooting corn dogs.
Not a great comparison, because unlike with violent games or movies, you can’t say that there is no danger to anyone in allowing these images to be created or distributed. If they are indistinguishable from the real thing, it then becomes impossible to identify actual human victims.
There’s also a strong argument that the availability of imagery like this only encourages behavioral escalation in people who suffer from the affliction of being a sick fucking pervert pedophile. It’s not methadone for them, as some would argue. It’s just fueling their addiction, not replacing it.
The difference is intent. When you’re playing a FPS, the intent is to play a game. When you play GTA the intent is to play a game.
The intent with AI generated CSAM is to watch kids being abused.
Punishing people for intending to do something is punishing them for thought crimes. That is not the world I want to live in.
This guy did do something - he either created or accessed AI generated CSAM.
Then every artist creating loli porn would also have to be jailed for child pornography.
But this is the US… and it’s kind of a double standard if you’re not arrested for drawing it, but are for generating it.
I must admit, the number of comments defending AI images as not child porn is truly shocking.
In my book, sexual images of children are not okay, AI generated or otherwise. Pedophiles need help, counseling and therapy. Not images that enable something I think is not acceptable in society.
I truly do believe that AI images should be subject to the same standards as regular images in what content we deem appropriate or not.
Yes, this can be used to wrongfully prosecute innocent people, but it does not mean that we should freely allow AI-CP.
Pedophiles need help, counseling and therapy. Not images that enable something I think is not acceptable in society.
I mean, 30–40 years ago you could replace the word pedophile with homosexual and a vast majority of people would agree. I’m not defending pedophilia here, but it’s important to remember these people are born the way they are. Nothing is going to change that; new pedophiles are born every day. They will never go away, the same way you can’t change gay or transgender people. Repressing sexual desire never works; look at the priests in the Catholic Church. A healthy outlet such as AI-generated porn could save a lot of actual children from harm. I think that should be looked into.
I would like to know what source you have for claiming that pedophiles are “born the way they are.”
We understand some of the genetic and intrauterine developmental reasons for homosexuality, being trans, etc. That has scientific backing, and our understanding continues to grow and expand.
Lumping child predators in with consenting adults smacks of the evangelical slippery slope argument against all forms of what they consider to be “sexual deviance.” I’m not buying it.
Look, I get what you are saying, and I do agree. However, I don’t think that comparing pedophilic relations to LGBTQ struggles is fair. One is a consensual relationship between consenting adults; the other is exploitation and a high probability of setting someone up for lifelong mental struggles from a young age.
The number of people willing to go to bat for this on Lemmy is truly disturbing. What do they think these AI models are trained on?
It’s not necessarily trained on CP; it could be trained on images of children (already fucked up, who gave them that permission?) combined with pornography.
Can’t speak for others, but I agree that AI-CP should be illegal.
The question is how do we define the crime with our current laws? It does seem like we need a new law to address AI images, for things like AI-CP, revenge porn, and slanderous/misleading photos. (The Communist Harris and Trump with black people photos)
Where do we draw the line?
How do we regulate it?
Forced watermarks/labels on all tools?
Jail time? Fines?
Forced correction notices? (Doesn’t work for the news!)
This is all a slippery slope, but what I can say is I hope this goes to court. He loses. Appeals. Then it goes all the way up to the federal level so we can have a standard to point to.
The shit’s wrong.
Step one in fixing shit.
You’re not kidding.
The only possible way I could see a defense is if it were something like “AI CSAM results in a proven reduction of actual CSAM.”
But. The defenses aren’t even that!
They’re literally saying that CSAM is okay. I’m guessing a lot of these same comments would argue that deepfakes are okay as well. Just a completely fucked up perspective.
I generally think if something is not causing harm to others, it shouldn’t be illegal. I don’t know if “generated” CSAM causes harm to others though. I looked it up and it appears the research on whether CSAM consumption increases the likelihood of a person committing child abuse is inconclusive.
Honestly, I don’t care if it is AI/not real, I’m glad that the man was arrested. He needs some serious help for sexualising kids.
Chemical castration offers the best success rates, see my comment under OP citing research.
You and I both know he’s not going to get it. I have a kinda sympathy for ppl attracted to kids but refuse to act on it. They clearly know it’s not normal and recognize the absolute life destroying damage they can cause if they act on it. That being said there’s not many places you can go to seek treatment. Any institutions that advertised treatment would have ppl outside with pitchforks and torches.
Before anyone tries to claim I’m pro pedo you can fuck right off. I just wish it was possible for ppl who are attracted to kids and not out touching them to get some kind of therapy and medication to make them normal (or destroy their sex drive) before something terrible happens.
to get some kind of therapy and medication to make them normal
Hi, Psychologist here. Does society have strong evidence that therapeutic interventions are reducing rates of, say, the most common disorders of anxiety and depression? Considering that the rates of these are going up, I don’t think we can assume there’s a hugely successful therapy to help those attracted to CSA images to change. Psychology is not a very good science principally because it offers few extremely effective answers in the real world.
In terms of medication androgen antagonists are generally used. This is because lowering testosterone generally leads to a lower sex drive. Here is an article about those drugs, including an offender who asked for them: https://www.theguardian.com/society/2016/mar/01/what-should-we-do-about-paedophiles
TW: the article contains discussion of whether offenders are even psychologically disordered, when set within a historical cultural context of child-marriage. This paragraph is two above the illustration of people trapped within concentric circular walls, and starts “In the 2013 edition …”.
Collis began to research the treatment and decided that it was essential to his rehabilitation. He believes he was born a paedophile, and that his attraction to children is unchangeable. “I did NOT wake up one morning and decide my sexual preference. I am sexually attracted to little girls and have absolutely no interest in sex with adults. I’ve only ever done stuff with adults in order to fit in with what’s ‘normal’.” For Collis, therefore, it became a question of how to control this desire and render himself incapable of reoffending.
[…]
Many experts support Aaron Collis’s self-assessment, that paedophilia is an unchangeable sexual preference. In a 2012 paper, Seto examined three criteria – age of onset, sexual and romantic behaviour, and stability over time. In a number of studies, a significant proportion of paedophiles admitted to first experiencing attraction towards children before they had reached adulthood themselves. Many described their feelings for children as being driven by emotional need as well as sexual desire. As for stability over time, most clinicians agreed that paedophilia had “a lifelong course”: a true paedophile will always be attracted to children. “I am certainly of the view,” Seto told me, “that paedophilia can be thought of as a sexual orientation.”
Brain-imaging studies have supported this idea. James Cantor, a psychiatry professor at the University of Toronto, has examined hundreds of MRI scans of the brains of paedophiles, and found that they are statistically more likely to be left-handed, shorter than average, and have a significantly lower density of white matter, the brain’s connective tissue. “The point that’s important for society is that paedophilia is in the brain at all, and that the person didn’t choose it,” Cantor told me. “As far as we can tell, they were born with it.” (Not that this, he emphasised, should excuse their crimes.)
[…]
Clinical reality is a little more complicated. “There’s no pretence that the treatment is somehow going to cure them of paedophilia,” Grubin told me. “I think there is an acceptance now that you are not going to be able to change very easily the direction of someone’s arousal.” Grubin estimates that medication is only suitable for about 5% of sex offenders – those who are sexually preoccupied to the extent that they cannot think about anything else, and are not able to control their sexual urges. As Sarah Skett from the NHS put it: “The meds only take you so far. The evidence is clear that the best treatment for sex offending is psychologically based. What the medication does is help people have a little bit of control, which then allows them to access that treatment.”
Some research on success rates:
Prematurely terminating treatment was a strong indicator of committing a new sexual offense. Of interest was the general improvement of success rates over each successive 5-year period for many types of offenders. Unfortunately, failure rates remained comparatively high for rapists (20%) and homosexual pedophiles (16%), regardless of when they were treated over the 25-year period. [https://pubmed.ncbi.nlm.nih.gov/11961909/]
Within the observation period, the general recidivism and sexual recidivism rates were 33.1% and 16.5%, respectively, and the sexual contact recidivism rate was 4.7%. [https://journals.sagepub.com/doi/abs/10.1177/0306624X231165416 - this paper says that suppressing the sex drive with medication was the most successful treatment]
Men with deviant sexual behavior, or paraphilia, are usually treated with psychotherapy, antidepressant drugs, progestins, and antiandrogens, but these treatments are often ineffective. Selective inhibition of pituitary–gonadal function with a long-acting agonist analogue of gonadotropin-releasing hormone may abolish the deviant sexual behavior by reducing testosterone secretion. [https://www.nejm.org/doi/full/10.1056/nejm199802123380702 - this paper supports that lowering testosterone works best]
I don’t understand why we haven’t used inhalable oxytocin as an experimental drug for people attracted to children and animals. It seems intuitive: children and animals generate oxytocin for humans automatically, and it’s possible some people need a stronger stimulus to release oxytocin or may not have a lot of oxytocin endogenously. Oxytocin can be compounded at a pharmacy and has been used successfully for social anxiety.
Thank you for such a well laid out response and the research to back it up. I rarely see people approaching the subjects of pedophilia, and how best to treat pedophiles, rationally and analytically.
It’s understandable, considering the harm they can cause to society, that most can only ever view them as monsters. And indeed, those who are incapable of comprehending the harm they cause, or of empathizing with those they have harmed or could harm, are IMHO some of the more loathsome individuals.
That said, I think people are too often willing to paint others whose proclivities are alien and antithetical to our own not only as monsters, but as monsters not worth understanding with any degree of nuance. In doing so, we ultimately do ourselves and future generations a disservice by not at least attempting to address the issue at hand, in the hopes that the most harmful parts of our collective psyche can be treated palliatively to the best of our ability.
Your annotated sources indicate that the path forward is not nearly as clear as those who insist “pedophiles are simply monsters and there’s no reason to look into their motives further” would like to believe, while the very existence of these attempted treatments points out that there is more work to be done toward a more lasting and successful rate of treatment.
Like many of the psychological ailments plaguing societies today, you cannot simply kill and imprison the problem away. That is always a short-term (albeit at times temporarily effective) solution. The solution to the problem of how to greatly reduce the occurrence of pedophilia will ultimately require more of this kind of research, and more analysis and study toward achieving such ends.
Again, I thank you for your nuanced post, and commend you for taking your nuanced stance as well.
If no children were involved in the production of porn, how is it pedophilic? That’s like claiming a picture of water has the same properties as water.
It’s pedophilic because it’s sexual images of children; fake or not does not change that. Drawn images of child pornography are still pedophilic.
The more important question is, is it CSAM? Whether drawn images that represent no real child are or not depends on the legal jurisdiction. Should drawn and AI generated child porn be legal or banned? I think the actual answer to that would require significant research into whether its existence acts as an outlet to prevent pedophiles from harming actual children, or whether it encourages their proclivities and makes them more likely to hurt actual children.
Preventing harm to children should be the goal, but actual research into the effects of simulated child porn vis-à-vis pedophiles harming actual children is as unlikely to happen as any other research into pedophilia.
However, a picture of water makes me thirsty. But then again, there is no substitute for water.
I am not defending pedos, or defending Florida for doing something like that.
It’s not really children in these pics. We can’t condemn people for things that are not illegal yet.
It’s Florida. They will simply book him and then present him a deal for “only x years prison”, which he’ll take and therefore prevent this from going to court and actually be ruled upon.
I’ve always wondered the same when an adult cop pretends to be a kid only to catch pedos. Couldn’t a lawyer argue that because there actually wasn’t a child, there wasn’t a crime?
I’d like to watch that court case. “I knew it was an old cop and wanted to fuck him.”
There was an episode of “the boondocks” where that happened. https://youtu.be/3d200DatLtU?si=Uu2Jt-RlWOEo4juG .