-11 points
*

That’s all well and good to remove them, but on its own it solves nothing. At this point every easily accessible AI I’m aware of is kicking back any prompt with the name of a real-life person; they’re already anticipating real laws. Preventing the images from being made in the first place isn’t impossible.

1 point

Depending on the AI developers to stop this on their own is a mistake. As is preemptively accepting child porn and deepfakes as inevitable rather than attempting to stop or mitigate it.

2 points

How do you deal with paint-in generation, then?

-1 points
*

The current method is auto-deleting NSFW images. It doesn’t matter how you got there: if the service detects NSFW output, it dumps it and you never get an image. Beyond that, there’s gating NSFW content generation behind a paywall or ID wall. That wouldn’t stop every teenager, but it would put a dent in it.

There are also AI models that will allow some NSFW content if it’s clearly in an artistic style, like a watercolor painting, but will reject NSFW realism: photography, rendered images, that sort of thing. These checks usually apply in every mode (prompt, paint-in/out, and image reference), both when an image is likely to be NSFW and as a final NSFW check before delivering the result. AI services are anticipating full-on legal consequences for allowing any NSFW image, or any realistic (photographic or CGI) image of a living person without their consent; it’s easy to see that’s what they are prepared for.

7 points

I assume anyone currently generating AI porn is running a model locally rather than using a service, given that absolute boatloads of generated hentai get posted every day?

6 points

Sure, for some tools. There are other tools that don’t do that.

Chasing after the tools and services is a waste. Make harassment more clearly defined, and go after the people who victimize others.

6 points

Web services and AI in general are completely different things. Web services that generate AI content want to avoid scandals, so they’re constantly blocking things that might be inappropriate in some situations, to the point where those services are incapable of performing a great many legitimate tasks.

Somebody running their own image generator on their own computer using the same technology is limited only by their own morals. They can train the generator on content that public services would not, and they are not constrained by prompt or output filters.

6 points

Agreed. To me, making them is one thing; it’s like making a drawing at home. Is it moral? Not really. Should it be illegal? I don’t think so.

Now, what this kid did, distributing them? Absolutely not okay. At that point it’s no longer private, and it could hurt her reputation.

This of course ignores the fact that she’s underage, which is wrong on its own. AI generated csam is still csam.

2 points
*
Removed by mod
10 points
*

“AI generated csam is still csam.”

Idk, with real people the determination of whether someone is underage is based on their age, not their physical appearance. There are people who look unnaturally young who could legally do porn, and underage people who look much older but aren’t allowed. It’s not about how they look, but how old they are.

With drawn or AI-generated CSAM, how would you draw that line of what’s fine and what’s a major crime with lifelong repercussions? There’s not an actual age to use, the images aren’t real, so how do you determine the legal age? Do you do a physical developmental point scale and pick a value that’s developed enough? Do you have a committee where they just say “yeah, looks kinda young to me” and convict someone for child pornography?

To be clear, I’m not trying to defend these people, but trying to determine what counts as legal or non-legal for fake images seems like a legal nightmare. I’m sure there are cases that would be more clear cut (if they generate with a specific age in the prompt, try to deepfake a specific person, etc.), but a lot of it seems really murky when you try to imagine how to actually prosecute over it.

6 points

All good points, and I’ll freely admit I’m not qualified to answer that. I also don’t think politicians or moms’ groups or anyone else are.

All I’ll do is muddy the waters more. The vast majority of humanity thinks CSAM is sick, and that those who consume it are not healthy. But I’ve read that psychologists are split. Some think AI-generated CSAM is bad, should be illegal, and only makes those who consume it worse. Others, however, suggest it may actually curb urges, and ask why not let them generate it, since it might actually reduce the number of real children being harmed.

I personally have no idea, and again am not qualified to answer those questions, but goddamn did AI really just barge in without us being ready for it. Fucking big tech again. “I’m sure society will figure it out”

14 points

A friend in high school made nude drawings of another mutual friend. It was weird that he showed me, but he was generally an artsy guy, I knew he was REALLY into this girl, and it was kind of in the context of showing me his artwork. I reconnected with the girl years later and talked about this, and while she said it was weird, she didn’t really think much of it. Rather, the creepy part to her was that he showed people.

I don’t think we can stop horny teens from making horny content about their classmates; heck, I know multiple girls who wrote erotic stories featuring classmates. The sharing (and the realism) is what turns creepy but kind of understandable teenage behavior into something we need to deal with.

9 points

All I’m hearing is jail time for Tina Belcher and her erotic friend fiction!

But seriously, I generally agree that as long as people aren’t sharing it, it shouldn’t be a problem. If I can picture it in my head without consequence, it seems kinda silly that putting that thought on paper/screen should be illegal.

2 points

Reputation matters less than harassment. If these people were describing her body publicly it would be a similar attack.

4 points

How?

-1 points
*

If you really must, you can simply have the AI auto-delete NSFW images; several services already do this. And if refusing to ever generate or deliver NSFW images isn’t an option, you can gate NSFW content generation behind any number of hindrances that are highly effective against anonymous or underage use.

4 points

Modern AI is not capable of this. The accuracy of NSFW detection is not good, and models are completely incapable of judging when NSFW content is allowable, because they have no morals and understand nothing about people or situations beyond appearance.

