13 points

Holy shit!

I haven't been on fb for a long time, I mean years, so I didn't realize it had basically turned into the dumping ground for the rest of the internet.

It reminds me of an article I read a long time ago about the police units that have to review CSAM and the toll it takes on them.

I don't typically have much sympathy for the police, but anyone who has to look at that stuff and suffer psychological damage in order to convict the people who create it has my respect.

Those moderators don’t deserve that shit. I hope they win their lawsuit.

-22 points

That’s racist

46 points

This is what AI should actually be used for. Strengthen the algorithms that identify it, reduce the load humans need to review, and it should hopefully become more manageable.
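The triage idea above can be sketched as a simple two-threshold router: high-confidence hits get actioned automatically, the uncertain middle band goes to human moderators, and everything else is left alone. The thresholds and names here are invented for illustration, not how any real platform is configured:

```python
# Illustrative triage by classifier confidence (thresholds are made up).
AUTO_REMOVE = 0.98   # high confidence: act without a human in the loop
NEEDS_REVIEW = 0.60  # uncertain band: queue for a moderator

def triage(score: float) -> str:
    """Route one item based on the model's confidence score."""
    if score >= AUTO_REMOVE:
        return "auto_remove"
    if score >= NEEDS_REVIEW:
        return "human_review"
    return "no_action"

scores = [0.99, 0.75, 0.10, 0.62]
queue = [s for s in scores if triage(s) == "human_review"]
print(len(queue))  # only the uncertain band reaches humans
```

The point of the middle band is exactly the trade-off in the thread: widen it and moderators see more (including false positives like the examples below); narrow it and the model auto-actions more mistakes on its own.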

5 points

I've already seen discussions of Nazi imagery in media get flagged for promoting Nazis by those systems. And to be clear, it was the villains who had the Nazi imagery, and the blog was discussing how fascists use charisma.

We've also seen sand dunes get flagged for pornography when Tumblr banned 18+ content.

32 points

AI flagged my VR controller as a gun on Facebook and my account received a 30 day ban

8 points

"We need more AI." It's like, mate, we need intelligence before we even attempt to make it artificial. We're so fucked. AI's the perfect tool for mass stupidity.

-8 points

Look… this is going to sound exceedingly stupid… but shouldn't they find a way to use convicted sex offenders to filter out CSAM? They are not going to be traumatised by it, and it saves some other poor soul from taking the brunt. How do you motivate them to actually do it? Well, the first one has to flag, and a second one earns a bonus every time the first one flags wrong. Motivation aligned!

</joking… but seriously though…>

4 points

There’s actually a lot of logic to the general idea of hiring specific population groups that don’t get traumatised by the content they’re checking. Problem is FB (and other companies in the same vein) can’t be bothered to do anything except hire the lowest bidder.

1 point

Come on, many 4channers would do it for free, just for entertainment, since most of them are NEET. Maybe give them some food and mattresses to sleep on in the office.

64 points

I hope they win their lawsuit. I listened to a Radiolab episode a few years ago about FB moderators. The shit they have to see day in and day out sounds absolutely horrible. Pics and videos of extreme violence and child pornography sound like they'd give any normal person some major trauma.

3 points

Blows my mind that people would post CSAM on Facebook of all places.

0 points

That’s not fb’s fault though?

11 points

The company should be doing more to support these employees, that's the point. Right now, Meta doesn't give a fuck if their employees are getting severely traumatized trying to keep content off their platforms. They don't pay them much, don't offer resources for mental health, etc. A (maybe bad) analogy: a construction company with no heavy-machinery safety policies that, when employees get hurt and can't work anymore, just fires them with no workers' comp.

For comparison, hospitals or law enforcement provide therapy and/or other mental health resources for their employees, since those jobs put their employees in potentially traumatic positions with some frequency (e.g. a doctor/nurse witnessing death a lot).

3 points

Yeah you’re right

37 points

Exposed to a firehose of the worst humanity has to offer. I can’t even imagine

8 points

Go on the Reddit front page or Twitter home page with a heart rate and blood pressure monitor and scroll for 10 minutes.

You will find that in almost every instance, both measures go up. The whole point of social media is to agitate because agitation correlates with engagement which correlates with mucho ad dinero.

