35 points

Not the company making the CSAM machine though.

2 points

I’m so old the only AI tools we had were drilling holes into the girls’ locker room shower walls.

66 points
Deleted by creator
2 points

I sometimes think “AI” is code for ordinary computing capacity, but with the ability to decentralize and spawn a market of buzzword-style shovelware pump-and-dumps.

13 points

Would the impact have been any different if they had used Photoshop?

7 points
Deleted by creator
14 points

Since when could someone do this … with a couple of clicks and zero training, for free and on any device?

You know, somebody probably said the exact same thing about Photoshop when it first came out? Back when cutting up photos and pasting them together was a thing. Then again, somebody probably said the same thing about the advent of photography too, during the days of woodcuts and oil paintings.

21 points

We (adults) created a world with knives, guns, and automobiles. But if a child used any of these to commit manslaughter, would you still blame adults collectively? No, parents are held responsible for protecting their children and controlling their access to dangerous tools.

6 points

I’m an adult and am not responsible for anything you described. They were all there even before I was born. In fact, the same may apply to my parents or even grandparents. I’d rather blame a sociopolitical class than any single generation for all those ills.

But to answer your question: yes, I’d blame that entire class for the harm caused by young people using the murder tools it introduced. They did it with full knowledge of the consequences. They valued momentary material gains above the wellbeing of entire generations. They absolutely should be punished for all the mass shootings in schools, because they knew it could happen. Yet they chose the blood money. Similarly, if an entire city is in the grip of a drug epidemic (like the current opioid crisis), wouldn’t you want to hunt down the producers and suppliers, instead of the users?

3 points

I mean, I would and do in fact blame societal and familial problems when kids are brutal, unkind, or hurtful to others, and similarly blame societal and familial problems when kids are not protected from brutal, unkind, and hurtful things.

Why are you saying the things you’re saying like a gotcha? Do you not feel that society has a significant impact on the behavior of youth?

11 points

Yeah. Society is responsible for the outcomes of its children. It shouldn’t hurt your feelings personally, but it should motivate your actions.

14 points
Deleted by creator
1 point

Knives and car keys are accessible. I don’t know about porn AI tools; I haven’t seen any myself. The AI tools I did see have guardrails that make improper use impossible.

2 points

I haven’t seen a single comment like the one you described.

6 points
Deleted by creator

I am so lucky I wasn’t in school when AI was around.

6 points

People used to say that about social media

3 points

🤖 I’m a bot that provides automatic summaries for articles:


In addition to probation, the teens will also be required to attend classes on gender and equality, as well as on the “responsible use of information and communication technologies,” a press release from the Juvenile Court of Badajoz said.

In addition to mental health impacts, victims have reported losing trust in classmates who targeted them and wanting to switch schools to avoid further contact with harassers.

Minors targeting classmates may not realize exactly how far images can potentially spread when generating fake child sex abuse materials (CSAM); they could even end up on the dark web.

An investigation by the United Kingdom-based Internet Watch Foundation (IWF) last year reported that “20,254 AI-generated images were found to have been posted to one dark web CSAM forum in a one-month period,” with more than half determined most likely to be criminal.

While lawmakers struggle to apply existing protections against CSAM to AI-generated images or to update laws to explicitly prosecute the offense, other more drastic solutions to prevent the harmful spread of deepfakes have been proposed.

Ars could not immediately reach Meta for comment on efforts to combat the proliferation of AI-generated CSAM on WhatsApp, the private messaging app that was used to share fake images in Spain.


Saved 72% of original text.


Technology

!technology@beehaw.org
