117 points
Deleted by creator
40 points

My personal belief still is that the prohibitive approach is futile and ultimately more harmful than the alternative: embrace the technology, promote it and create deepfakes of everyone.

Soon the taboo will be gone, the appeal as well, and everyone will have plausible deniability too, because if there are dozens of fake nudes of any given person then who is to say which are real, and why does it even matter at that point?

This would be a great opportunity to advance our societal values and morals beyond prudish notions, but instead we double down on them.

E: just to clarify, I do not at all want to endorse creating nudity of minors here. I’m just pointing out that the girl in the article wouldn’t have to humiliate herself trying to do damage control in the above scenario, because it would be entirely unimportant.

63 points

While I think removing the stigma associated with having deepfakes made of you is important, I don’t think that desensitization through exposure is the way to go about it. That will cause a lot of damage leading up to the point you’re trying to reach.

5 points

I don’t see how else you do it.

“Removing the stigma” is desensitizing by definition. So you want to desensitize through… what? Education?

-7 points

Ever seen a deepfake nude of someone ugly? People make them because they wanna see you naked. Can’t see how that’s an insult.

23 points

This sounds like a cool idea because it is a novel approach, and it appeals to my general heuristic of the inevitability of technology and freedom. However, I don’t think it’s actually a good idea. People are entitled to privacy, on this I hope we agree – and I believe this is because of something more fundamental: people are entitled to dignity. If you think we’ll reach a point in this lifetime where deepfakes will be too commonplace to be a threat to someone’s dignity, I just don’t agree.

Not saying the solution is to ban the technology though.

16 points

When you put out photos of yourself on the internet, you should expect anyone to find them and do whatever they want to them. If you aren’t expecting that, then you aren’t educated enough on how the internet works, and that’s what we should be working on. Social media is really bad for privacy and many people are not aware of it.

Now if someone took a picture of you and then edited it without your consent, that is a different action and a far more serious offense.

Either way, deepfakes are just an evolution of something that already existed before and isn’t going away anytime soon.

19 points

It’s also worth noting that too many people put out way too much imagery of themselves online. People have got to start expecting that anything you put out in the public domain becomes public domain.

12 points

I second this motion. People also need to stop posting images of themselves all over the web, especially of their own kids. Parents plastering their kids’ images all over social media should not be condoned.

And on a related note we need much better sex-education in this country and a much healthier relationship with nudity.

17 points

You don’t turn 18 and magically discover your actions have consequences.

“Not a heavy crime”? I’ll introduce you to Sarah, Marie and Olivia. You can tell them it was just a joke. You can tell them the comments they’ve received as a result are just jokes. The catcalling, mentions that their nipples look awesome, that their pussies look nice, etc are just jokes. All 3 of them are changing schools, 2 are failing their years. Because of something someone else did to them. And you claim it’s not that bad? The fuck is wrong with you?

35 points
Deleted by creator
1 point

I don’t think maturity is an explicit thing in a binary form. I would be OK with the presumption that the age of 18 provides a general expected range of maturity between individuals; it’s when you start to develop your world view and really pick up on the smaller things in life and how they work together to make a functional system.

I think the idea of putting a “line” on it is wrong. I think it’s better to describe it as “this is generally what you expect from this subset.”

-10 points

You’re right, his parents have to be punished. They didn’t teach him how to respect others properly.

-19 points
Deleted by creator
18 points

Perhaps at least a small portion of the blame for what these girls are going through should be laid upon the society which obstinately teaches that a woman’s worth as a person is so inextricably tied to her willingness and ability to maintain the privacy of her areolas and vulva that the mere appearance of having failed in the endeavour is treated as a valid reason to disregard her humanity.

-13 points
Deleted by creator
-1 points

All 3 of them are changing schools, 2 are failing their years. Because of something someone else did to them. And you claim it’s not that bad? The fuck is wrong with you?

and by the time they’re 18 and moving on to college or whatever, they’re probably busy not fucking worrying about whatever happened in high school, because at the end of the day you have two options here:

be a miserable fuck, or try to be the least miserable fuck you can and do something productive.

Generally people pick the second option.

And besides, at the end of the day, it’s literally not real, none of this exists. It’s better than having your nudes leaked. Should we execute children who spread nudes of other children now? That’s a far WORSE crime to be committing, because now that shit is just out there, and it’s almost definitely on the internet, AND IT’S REAL.

Seems to me like you’re unintentionally nullifying the consequences of actual real CSAM here.

Is my comment a little silly and excessive? Yes, that was my point. It’s satire.

2 points

Victims of trauma don’t just forget because time passes. They graduate (or don’t) and move on in their lives, but the lingering effects of that traumatic experience shape the way they look at the world: whether they can trust, body dysphoria, whether they can form long-lasting relationships, and other long-lasting trauma responses. Time does not heal the wounds of trauma; they remain as scars that stay vulnerable forever (unless deliberate action is taken by the victim to dismantle the cognitive structure formed by the traumatic event).

4 points

Using this idea would give minors a feeling of complete safety when committing crimes. I don’t think you have any sort of morals if you support it, but that’s a question for your local law enforcement. The crime in question can seriously damage the mental health of the victim and be a reason for severe discrimination. Older minors should be responsible for their actions too.

-6 points

This society is truly dead.

58 points

“This kid who is not getting any kind of real consequence other than a little bit of probation, and then when he’s 18, his record will be expunged, and he’ll go on with life, and no one will ever really know what happened,” McAdams told CNN.

“If [this law] had been in place at that point, those pictures would have been taken down within 48 hours, and he could be looking at three years in jail…so he would get a punishment for what he actually did,” McAdams told CNN.

There’s a reason kids are tried as kids and their records are expunged when they become adults. Undoing that will just ruin lives without lessening occurrences.

“It’s still so scary as these images are off Snapchat, but that does not mean that they are not on students’ phones, and every day I’ve had to live with the fear of these photos getting brought up resurfacing,” Berry said. “By this bill getting passed, I will no longer have to live in fear knowing that whoever does bring these images up will be punished.”

This week, Republican Senator Ted Cruz, Democratic Senator Amy Klobuchar and several colleagues co-sponsored a bill that would require social media companies to take down deep-fake pornography within two days of getting a report.

“[The bill] puts a legal obligation on the big tech companies to take it down, to remove the images when the victim or the victim’s family asks for it,” Cruz said. “Elliston’s Mom went to Snapchat over and over and over again, and Snapchat just said, ‘Go jump in a lake.’ They just ignored them for eight months.”

BS

It’s been possible for decades for people to share embarrassing pictures of you, real or fake, on the internet. Deep fake technology is only really necessary for video.

Real or fake pornography involving unwilling participants (revenge porn) is already illegal and already taken down, and because the girl is underage it’s extra illegal.

Besides the legal aspect, the content described in the article, which may be an exaggeration of the actual content, is clearly in violation of Snapchat’s rules and would have been taken down:

  • We prohibit any activity that involves sexual exploitation or abuse of a minor, including sharing child sexual exploitation or abuse imagery, grooming, or sexual extortion (sextortion), or the sexualization of children. We report all identified instances of child sexual exploitation to authorities, including attempts to engage in such conduct. Never post, save, send, forward, distribute, or ask for nude or sexually explicit content involving anyone under the age of 18 (this includes sending or saving such images of yourself).
  • We prohibit promoting, distributing, or sharing pornographic content, as well as commercial activities that relate to pornography or sexual interactions (whether online or offline).
  • We prohibit bullying or harassment of any kind. This extends to all forms of sexual harassment, including sending unwanted sexually explicit, suggestive, or nude images to other users. If someone blocks you, you may not contact them from another Snapchat account.
6 points

Is revenge porn illegal federally? Not that that would really matter; a state could still lack such a law and have no way to prosecute it.

Given that some state recently passed a revenge porn law, it’s clear you’re just wrong.

On Snapchat’s ToS: luckily I never ran into the first point personally, but as a teenager I heard about it happening quite a bit.

The second point is literally not enforced at all, to the point where they recommend some sort of private Snapchats which are literally just porn made by models.

Don’t know how well they enforce the last point.

22 points

I looked it up before posting. It’s illegal in 48 states, including California where most of these companies are headquartered, and every state where major cloud data centers are located. This makes it effectively illegal by state laws, which is the worst kind of illegal in the United States when operating a service at a national level because every state will have slightly different laws. No company is going to establish a system that allows users in the two remaining states to exchange revenge porn with each other except maybe a website established solely for that purpose. Certainly Snapchat would not.

I’ve noticed recently there are many reactionary laws making specific things illegal that are already illegal, or should already be illegal, under a more general law. We’d be much better off with a federal standardization of revenge porn laws than with a federal law that outlaws essentially the same thing but only when a specific technology is involved.

31 points

I apologize for the inappropriate behavior and bans by @TheAnonymouseJoker@lemmy.ml in this thread. I’ve removed them as a mod here, banned them, and unbanned the people who they inappropriately banned.

Note: if they get unbanned in the near future, it’s because of our consensus procedure, which requires us admins to take a vote.

6 points

Appreciate it.

5 points

Thank you

They have another account on lemmygrad: https://lemmygrad.ml/u/TheAnonymouseJoker

Anyone around who knows the admins there, so they can take a look too?

24 points

Odd that there is no mention of the parents contacting the police and working through them to get the images taken down. Technically and legally, the photos would be considered child porn. Since it’s over the Internet, it would bring Federal charges, even though there may be State charges as well. Something was handled wrong if all the kid is getting is probation.

22 points

photos

They aren’t photos. They’re photorealistic drawings done by computer algorithms. This might seem like a tiny quibble to many, but as far as I can tell it is the crux of the entire issue.

There isn’t any actual private information about the girls being disclosed. The algorithms, for example, do not and could not know about and produce an unseen birthmark, mole, tattoo, piercing, etc. A photograph would have that information. What is being shown is an approximation of what similar-looking girls in the training set look like, with the girls’ faces stitched on top. That is categorically different from something like revenge porn, which is purely private information specific to the individual.

I’m sure it doesn’t feel all that different to the girls in the pics, or to the boys looking at it for that matter. There is some degree of harm here without question. But we must tread lightly because there is real danger in categorizing algorithmic guesswork as reliable which many authoritarian types are desperate to do.

https://www.wired.com/story/parabon-nanolabs-dna-face-models-police-facial-recognition/

This is the other side of the same coin. We cannot start treating the output of neural networks as facts. These are error prone black-boxes and that fact must be driven hard into the consciousness of every living person.

For some, I’m sure purely unrelated, reason, I feel like reading Philip K. Dick again…

4 points

They aren’t photos. They’re photorealistic drawings done by computer algorithms. This might seem like a tiny quibble to many, but as far as I can tell it is the crux of the entire issue.

most phone cameras alter the original image with AI shit now; it’s really common, they apply all kinds of weird corrections to make it look better. Plus if it’s social media there’s probably a filter somewhere in there. At what point does this become the ship of Theseus?

My point here is that if we’re arguing AI images are, semantically, not photos, then most photos of people on the internet would also arguably not be photos to some degree.

5 points

The difference is that a manipulated photo starts with a photo. It actually contains recorded information about the subject. Deepfakes do not contain any recorded information about the subject unless that subject is also in the training set.

Yes, it is semantics; it’s the reason why we have different words for photography and drawing, and they are not interchangeable.

2 points

I’ve only read Do Androids Dream of Electric Sheep? by him. What other book(s) should I check out?

2 points

Androids/sheep was so good

1 point

Whether or not you consider them photos, the DOJ considers them child porn and you will still go to jail.

0 points
Deleted by creator
-5 points

Technically and legally the photos would be considered child porn

I don’t think that has been tested in court. It would be a reasonable legal argument to say that the image isn’t a photo of anyone. It doesn’t depict reality, so it can’t depict anyone.

I think at best you can argue it’s a form of photo manipulation, and the intent is to create a false impression about someone. A form of image based libel, but I don’t think that’s currently a legal concept. It’s also a concept where you would have to protect works of fiction otherwise you’ve just made the visual effects industry illegal if you’re not careful.

In fact, that raises an interesting analogy. We do not allow animals to be abused, but we allow images of animal abuse in films as long as they are faked. We allow images of human physical abuse as long as they are faked. Children are often in horror films, and creating the images we see is very strictly managed so that the child actor is not exposed to anything that could distress them. The resulting “works of art” are not under such limitations as far as I’m aware.

What’s the line here? Parental consent? I think that could lead to some very concerning outcomes. We all know abusive parents exist.

I say all of this not because I want to defend anyone, but because I think we’re about to set some really bad legal precedents if we’re not careful. Ones that will potentially do a lot of harm. Personally, I don’t think the concept of any image, or any other piece of data, being illegal holds water. Police people’s actions, not data.

20 points
Removed by mod
1 point
Deleted by creator
