This is the best summary I could come up with:
The teen’s phone was flooded with calls and texts telling her that someone had shared fake nude images of her on Snapchat and other social media platforms.
Berry, now 15, is calling on lawmakers to write criminal penalties into law for perpetrators to protect future victims of deepfake images.
“This kid who is not getting any kind of real consequence other than a little bit of probation, and then when he’s 18, his record will be expunged, and he’ll go on with life, and no one will ever really know what happened,” McAdams told CNN.
The mom and daughter say legislation is essential to protecting future victims, and could have meant more serious consequences for the classmate who shared the deep-fakes.
“If [this law] had been in place at that point, those pictures would have been taken down within 48 hours, and he could be looking at three years in jail…so he would get a punishment for what he actually did,” McAdams told CNN.
“It’s still so scary as these images are off Snapchat, but that does not mean that they are not on students’ phones, and every day I’ve had to live with the fear of these photos getting brought up resurfacing,” Berry said.
The original article contains 585 words, the summary contains 205 words. Saved 65%. I’m a bot and I’m open source!
That’s all well and good to remove them, but it solves nothing. At this point, every easily accessible AI I’m aware of is kicking back any prompts with the names of real-life people; they’re already anticipating real laws. Preventing the images from being made in the first place isn’t impossible.
If you really must, you can simply have the AI auto-delete nsfw images; several already do this. And if the argument is that you can’t simply refuse to generate or give out nsfw images at all, you can also gate nsfw content generation behind any number of hindrances that are highly effective against anonymous or underage use.
Modern AI is not capable of this. The accuracy for detecting nsfw content is not good, and they are completely incapable of detecting when nsfw content is allowable because they have no morals and they don’t understand anything about people or situations besides appearance.
Agreed. To me, making them is one thing, it’s like making a drawing at home. Is it moral? Not really. Should it be illegal? I don’t think so.
Now, what this kid did, distributing them? Absolutely not okay. At that point it’s not private, and you could hurt their own reputation.
This of course ignores the whole fact that she’s underage, which is on its own wrong. AI generated csam is still csam.
A friend in high school made nude drawings of another mutual friend. It was weird he showed me, but he was generally an artsy guy, I knew he was REALLY into this girl, and it was kind of in the context of showing me his artwork. I reconnected with the girl years later and talked about this, and while she said it was weird, she didn’t really think much of it. Rather, the creepy part to her was that he showed people.
I don’t think we can stop horny teens from making horny content about their classmates; heck, I know multiple girls who wrote erotic stories featuring classmates. The sharing (and realism) is what turns the creepy but kind of understandable teenage behavior into something we need to deal with.
All I’m hearing is jailtime for Tina Belcher and her erotic friend fiction!
But seriously, I generally agree that as long as people aren’t sharing it, it shouldn’t be a problem. If I can picture it in my head without consequence, it seems kinda silly that putting that thought on paper/screen should be illegal.
AI generated csam is still csam.
Idk, with real people the determination on if someone is underage is based on their age and not their physical appearance. There are people who look unnaturally young that could legally do porn, and underage people who look much older but aren’t allowed. It’s not about their appearance, but how old they are.
With drawn or AI-generated CSAM, how would you draw that line of what’s fine and what’s a major crime with lifelong repercussions? There’s not an actual age to use, the images aren’t real, so how do you determine the legal age? Do you do a physical developmental point scale and pick a value that’s developed enough? Do you have a committee where they just say “yeah, looks kinda young to me” and convict someone for child pornography?
To be clear, I’m not trying to defend these people, but trying to determine what counts as legal/non-legal for fake images seems like a legal nightmare. I’m sure there are cases where this would be more clear-cut (if they AI-generate with a specific age, try to do deepfakes of a specific person, etc.), but a lot of it seems really murky when you try to imagine how to actually prosecute over it.
All good points, and I’ll for sure say that I’m not qualified enough to be able to answer that. I also don’t think politicians or moms’ groups or anyone else are.
All I’ll do is muddy the waters more. We as the vast majority of humanity think CSAM is sick, and those who consume it are not healthy. I’ve read that psychologists are split. Some think AI generated CSAM is bad, illegal, and only makes those who consume it worse. Others, however, suggest that it may actually curb urges, and ask why not let them generate it, it might actually reduce real children from being actually harmed.
I personally have no idea, and again am not qualified to answer those questions, but goddamn did AI really just barge in without us being ready for it. Fucking big tech again. “I’m sure society will figure it out”
The current method is auto-deleting nsfw images. Doesn’t matter how you got there: if it detects nsfw, it dumps it and you never get an image. Besides that, gating nsfw content generation behind a paywall or ID wall would stop a lot of teenagers. Not all, but it would put a dent in it. There are also AI models that will allow some nsfw if it’s clearly in an artistic style, like a watercolor painting, but will kick back nsfw realism or photography, rendered images, that sort of thing. These checks usually run in both prompt mode and paint-in/out or image-reference mode: blocking prompts likely to generate nsfw images, and running an nsfw check after generation before delivering the image. AI services are anticipating full-on legal consequences for allowing any nsfw content, or any realistic, photographic, or CGI image of a living person without their consent; it’s easy to see that’s what they’re prepared for.
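For what it’s worth, here’s a rough sketch of what that kind of gate looks like, purely illustrative Python: generate_image, nsfw_score, the blocked-term list, and the 0.5 threshold are all hypothetical stand-ins, not any real service’s API.

```python
# Purely illustrative sketch of the pipeline described above: a prompt filter,
# then a post-generation nsfw check that discards the image before delivery.
# Every name here is a hypothetical stand-in, not any real service's API.

BLOCKED_TERMS = {"nude", "nsfw"}   # illustrative prompt-filter terms
NSFW_THRESHOLD = 0.5               # assumed cutoff; real services tune this

def contains_blocked_terms(prompt: str) -> bool:
    return any(term in prompt.lower() for term in BLOCKED_TERMS)

def generate_image(prompt: str) -> bytes:
    # Stand-in for a real text-to-image model call.
    return b"...image bytes..."

def nsfw_score(image: bytes) -> float:
    # Stand-in for a real nsfw classifier; returns a probability-like score.
    return 0.0

def generate_with_gate(prompt: str):
    # 1. Prompt-level filter: reject obviously disallowed requests up front.
    if contains_blocked_terms(prompt):
        return None, "prompt rejected"
    # 2. Generate, then score the finished image and dump it before delivery
    #    if the classifier flags it, as described in the comment above.
    image = generate_image(prompt)
    if nsfw_score(image) >= NSFW_THRESHOLD:
        return None, "image discarded by post-generation nsfw check"
    return image, "ok"
```

Note this is a service-side control; as the next comment points out, someone running the model locally can simply skip the check.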
Web services and AI in general are completely different things. Web services that generate AI content want to avoid scandals so they’re constantly blocking things that may be in some situations inappropriate, to the point where those services are incapable of performing a great many legitimate tasks.
Somebody running their own image generator on their own computer using the same technology is limited only by their own morals. They can train the generator on content that public services would not, and they are not constrained by prompt or output filters.
My personal belief still is that the prohibitive approach is futile and ultimately more harmful than the alternative: embrace the technology, promote it and create deepfakes of everyone.
Soon the taboo will be gone, the appeal as well, and everyone will have plausible deniability too, because if there are dozens of fake nudes of any given person then who is to say which are real, and why does it even matter at that point?
This would be a great opportunity to advance our societal values and morals beyond prudish notions, but instead we double down on them.
E: just to clarify I do not at all want to endorse creating nudity of minors here. Just point out that the girl in the article wouldn’t have to humiliate herself trying to do damage control in the above scenario, because it would be entirely unimportant.
While I think removing the stigma associated with having deepfakes made of you is important, I don’t think that desensitization through exposure is the way to go about it. That will cause a lot of damage leading up to the point you’re trying to reach.
I don’t see how else you do it.
“Removing the stigma” is desensitizing by definition. So you want to desensitize through… what? Education?
This sounds like a cool idea because it is a novel approach, and it appeals to my general heuristic of the inevitability of technology and freedom. However, I don’t think it’s actually a good idea. People are entitled to privacy, on this I hope we agree – and I believe this is because of something more fundamental: people are entitled to dignity. If you think we’ll reach a point in this lifetime where it will be too commonplace to be a threat to someone’s dignity, I just don’t agree.
Not saying the solution is to ban the technology though.
When you put out photos of yourself on the internet, you should expect anyone to find them and do whatever they want to them. If you aren’t expecting that, then you aren’t educated enough on how the internet works, and that’s what we should be working on. Social media is really bad for privacy, and many people are not aware of it.
Now if someone took a picture of you and then edited it without your consent, that is a different action and it’s a lot more serious offense.
Either way, deepfakes are just an evolution of something that already existed before and isn’t going away anytime soon.
I second this motion. People also need to stop posting images of themselves all over the web. Especially their own kids. Parents plastering their kids images all over social media should not be condoned.
And on a related note we need much better sex-education in this country and a much healthier relationship with nudity.
Using this idea will give minors a feeling of complete safety when committing crimes. I don’t think you have any sort of morals if you support it, but that’s a question for your local law enforcement. The crime in question can seriously damage the mental health of the victim and be a reason for severe discrimination. Older minors should be responsible for their actions too.
You don’t turn 18 and magically discover your actions have consequences.
“Not a heavy crime”? I’ll introduce you to Sarah, Marie and Olivia. You can tell them it was just a joke. You can tell them the comments they’ve received as a result are just jokes. The catcalling, mentions that their nipples look awesome, that their pussies look nice, etc are just jokes. All 3 of them are changing schools, 2 are failing their years. Because of something someone else did to them. And you claim it’s not that bad? The fuck is wrong with you?
You’re right, his parents have to be punished. They didn’t teach him how to respect others properly.
I don’t think maturity is an explicit thing in binary form. I would be okay with the presumption that the age of 18 provides a general expected range of maturity between individuals; it’s when you start to develop your world view and really pick up on the smaller things in life and how they work together to make a functional system.
I think the idea of putting a “line” on it is wrong; I think it’s better to describe it as “this is generally what you expect from this subset.”
Perhaps at least a small portion of the blame for what these girls are going through should be laid upon the society which obstinately teaches that a woman’s worth as a person is so inextricably tied to her willingness and ability to maintain the privacy of her areolas and vulva that the mere appearance of having failed in the endeavour is treated as a valid reason to disregard her humanity.
All 3 of them are changing schools, 2 are failing their years. Because of something someone else did to them. And you claim it’s not that bad? The fuck is wrong with you?
and by the time they’re 18 and moving on to college or whatever, they’re probably busy not fucking worrying about whatever happened in high school, because at the end of the day you have two options here:
- be a miserable fuck.
- try to be the least miserable fuck you can, and do something productive.
Generally people pick the second option.
And besides, at the end of the day, it’s literally not real, none of this exists. It’s better than having your nudes leaked. Should we execute children who spread nudes of other children now? That’s a far WORSE crime to be committing, because now that shit is just out there, and it’s almost definitely on the internet, AND IT’S REAL.
Seems to me like you’re unintentionally nullifying the consequences of actual real CSAM material here.
Is my comment a little silly and excessive? Yes, that was my point. It’s satire.
Victims of trauma don’t just forget because time passes. They graduate (or don’t) and move on in their lives, but the lingering effects of that traumatic experience shape the way they look at the world, whether they can trust, body dysphoria, whether they can form long-lasting relationships, and other long-lasting trauma responses. Time does not heal the wounds of trauma; they remain as scars that stay vulnerable forever (unless deliberate action is taken by the victim to dismantle the cognitive structure formed by the trauma event).
“This kid who is not getting any kind of real consequence other than a little bit of probation, and then when he’s 18, his record will be expunged, and he’ll go on with life, and no one will ever really know what happened,” McAdams told CNN.
“If [this law] had been in place at that point, those pictures would have been taken down within 48 hours, and he could be looking at three years in jail…so he would get a punishment for what he actually did,” McAdams told CNN.
There’s a reason kids are tried as kids and their records are expunged when they become adults. Undoing that will just ruin lives without lessening occurrences.
“It’s still so scary as these images are off Snapchat, but that does not mean that they are not on students’ phones, and every day I’ve had to live with the fear of these photos getting brought up resurfacing,” Berry said. “By this bill getting passed, I will no longer have to live in fear knowing that whoever does bring these images up will be punished.”
This week, Republican Senator Ted Cruz, Democratic Senator Amy Klobuchar and several colleagues co-sponsored a bill that would require social media companies to take down deep-fake pornography within two days of getting a report.
“[The bill] puts a legal obligation on the big tech companies to take it down, to remove the images when the victim or the victim’s family asks for it,” Cruz said. “Elliston’s Mom went to Snapchat over and over and over again, and Snapchat just said, ‘Go jump in a lake.’ They just ignored them for eight months.”
BS
It’s been possible for decades for people to share embarrassing pictures of you, real or fake, on the internet. Deep fake technology is only really necessary for video.
Real or fake pornography including unwilling participants (revenge porn) is already illegal and already taken down, and because the girl is underage it’s extra illegal.
Besides the legal aspect, the content described in the article, which may be an exaggeration of the actual content, is clearly in violation of Snapchat’s rules and would have been taken down:
- We prohibit any activity that involves sexual exploitation or abuse of a minor, including sharing child sexual exploitation or abuse imagery, grooming, or sexual extortion (sextortion), or the sexualization of children. We report all identified instances of child sexual exploitation to authorities, including attempts to engage in such conduct. Never post, save, send, forward, distribute, or ask for nude or sexually explicit content involving anyone under the age of 18 (this includes sending or saving such images of yourself).
- We prohibit promoting, distributing, or sharing pornographic content, as well as commercial activities that relate to pornography or sexual interactions (whether online or offline).
- We prohibit bullying or harassment of any kind. This extends to all forms of sexual harassment, including sending unwanted sexually explicit, suggestive, or nude images to other users. If someone blocks you, you may not contact them from another Snapchat account.
Is revenge porn illegal federally? Not that that would really matter, a state could still not have a law and have no way to prosecute it.
Given that some state recently passed a revenge porn law, it’s clear you’re just wrong.
On Snapchat’s ToS: luckily I never ran into the first point personally, but as a teenager I heard about it happening quite a bit.
The second point is literally not enforced at all, to the point where they recommend some sort of private Snapchats which are literally just porn made by models
Don’t know how well they enforce the last point
I looked it up before posting. It’s illegal in 48 states, including California where most of these companies are headquartered, and every state where major cloud data centers are located. This makes it effectively illegal by state laws, which is the worst kind of illegal in the United States when operating a service at a national level because every state will have slightly different laws. No company is going to establish a system that allows users in the two remaining states to exchange revenge porn with each other except maybe a website established solely for that purpose. Certainly Snapchat would not.
I’ve noticed recently there are many reactionary laws to make illegal specific things that are already illegal or should already be illegal because of a more general law. We’d be much better off with a federal standardization of revenge porn laws than a federal law that specifically outlaws essentially the same thing but only when a specific technology is involved.
It’s not been reported on much because it doesn’t work that well. It’s not as easy as they want you to believe it is. I’m pretty sure most of the “promotional material” has been photoshopped or cherry picked at best
It’s not as easy as they want you to believe it is. I’m pretty sure most of the “promotional material” has been photoshopped or cherry picked at best
Absolutely, all of the material out there for marketing is digitally manipulated by a human to some degree. And if it isn’t, then honestly, I don’t know what you’re using AI image generation for lmao.