A mother and her 14-year-old daughter are advocating for better protections for victims after AI-generated nude images of the teen and other female classmates were circulated at a high school in New Jersey.
Meanwhile, on the other side of the country, officials are investigating an incident involving a teenage boy who allegedly used artificial intelligence to create and distribute similar images of other students – also teen girls – who attend a high school in suburban Seattle, Washington.
The disturbing cases have put a spotlight yet again on explicit AI-generated material that overwhelmingly harms women and children and is booming online at an unprecedented rate. According to an analysis by independent researcher Genevieve Oh that was shared with The Associated Press, more than 143,000 new deepfake videos were posted online this year, which surpasses every other year combined.
It’s time for the Butlerian Jihad.
I wonder what the prevalence of this kind of behavior is like in countries that aren’t so weird about sex.
What does this have to do with the other? Where I live nudity isn’t all that uncommon (when compared to the US, for example). But sexually harassing someone with fake porn is a whole different issue.
I see a lot of problems with people having trouble understanding consent and struggling to respect other people. Those boys are weird about sex. That’s the weirdness we should address.
My bad, I wasn’t as clear as I could have been. I meant, I wonder if boys would be so weird as to want to make such fake porn in places that are less weird about sex.
Did you think I was advocating for the fake images?
No I thought you meant that being hurt by fake porn about yourself is “being weird about sex”.
reading this, I don’t really know what is supposed to be protected here, or what would even qualify for protection in the first place.
closest reasonable one is the girl’s “identity”, so it could be fraud. but it’s not used to fool people. more likely, those getting the pics already knew it was ai generated.
so might be defamation?
the image generation tech is already easily accessible so the girl’s picture being easily accessible might be the weakest link?
Maybe it is just me, but it’s why I think this is a bigger issue than just Hollywood.
The rights to famous people’s “images” are bought and sold all the time.
I would argue that the entire concept should be made illegal. Others can only use your image with your explicit permission and your image cannot be “owned” by anyone but yourself.
The fact that making a law like this isn’t a priority means this will get worse because we already have a society and laws that don’t respect our rights to control of our own image.
A law like this would also remove all the questions about youth and sex and instead make it a case of misuse of someone else’s image. In this case it could even be considered defamation for altering the image to make it seem like it was real. They defamed her by making it seem like she took nude photos of herself to spread around.
That sounds pretty dystopian to me. Wouldn’t that make filming in public basically illegal?
In Germany it is illegal to take photos or videos of people who are identifiable (faces visible or in close-up) without asking for permission first, with the exception of public events, as long as you do not focus on individuals. It doesn’t feel dystopian at all, to be honest. I’d rather have it that way than end up on someone’s stupid vlog or whatever.
In the end you can’t stop it any more than you can stop teen boys from wanking. Eventually there will just be fake nudes of everyone, so it will have no meaning. It sucks, but it is how it is. Maybe people should get out in front of it by generating their own deepfakes of themselves, but embellish them some so they have an obvious fakeness, and age them up to legal age or something.
Does it suck? A future where people have gotten over feeling ashamed of having bodies sounds pretty cool.
If nudity wasn’t a big deal, it wouldn’t even occur to them to harass girls with fake nudes, and nobody would care if they tried.