Holy shit!
I haven't been on FB for a long time, I mean years, so I didn't realize it had basically turned into the dumping ground for the rest of the internet.
It reminds me of an article I read a long time ago about the police units that have to review CSAM and the toll it takes on them.
I don't typically have much sympathy for the police, but anyone who has to look at that stuff and basically take psychological damage in order to convict the people who create it has my respect.
Those moderators don’t deserve that shit. I hope they win their lawsuit.
That’s racist
This is what AI should actually be used for. Strengthen the algorithms that identify this stuff to cut down the volume humans need to review, and the job should hopefully be more manageable.
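To make that concrete, here's a minimal sketch of the threshold triage that comment is describing. Everything in it is hypothetical (the cutoffs, the routing labels, the idea that a violation probability comes in from some upstream classifier), not anything Meta actually runs:

    AUTO_REMOVE = 0.98  # confident enough to remove without a human
    AUTO_ALLOW = 0.02   # confident enough to leave alone

    def triage(score: float) -> str:
        """Route one item given the model's estimated violation probability."""
        if score >= AUTO_REMOVE:
            return "remove"        # no human ever sees it
        if score <= AUTO_ALLOW:
            return "allow"
        return "human_review"      # only the uncertain middle band hits the queue

    # e.g. triage(0.995) -> "remove", triage(0.5) -> "human_review"

The tighter you squeeze that middle band, the smaller the human queue gets, at the cost of more wrong automatic calls, which is exactly the tradeoff the false-positive stories below are complaining about.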
The same Kenyans were probably used to train those AI models.
I've already seen discussions of Nazi imagery in media get flagged for promoting Nazis by those systems. And to be clear, the villains were the ones with the Nazi imagery, and the blog was discussing how fascists use charisma.
We've also seen sand dunes get flagged as pornography when Tumblr banned 18+ content.
AI flagged my VR controller as a gun on Facebook and my account received a 30-day ban.
Look… this is going to sound exceedingly stupid… but shouldn't they find a way to use convicted sex offenders to filter out CSAM? They're not going to be traumatised by it, and it saves some other poor soul from taking the brunt. How do you motivate them to actually do it? Well, the first one flags, and a second one earns a bonus every time the first one flags wrong. Motivation aligned!
</joking… but seriously though…>
There’s actually a lot of logic to the general idea of hiring specific population groups that don’t get traumatised by the content they’re checking. Problem is FB (and other companies in the same vein) can’t be bothered to do anything except hire the lowest bidder.
I hope they win their lawsuit. I listened to a Radiolab episode a few years ago about FB moderators. The shit they have to see day in and day out sounds absolutely horrible. Pics and videos of extreme violence and child pornography sound like they'd give any normal person some major trauma.
The company should be doing more to support these employees, that's the point. Right now, Meta doesn't give a fuck if their employees are getting severely traumatized trying to keep this content off their platforms. They don't pay them much, don't offer mental health resources, etc. A maybe-bad analogy: it's like a construction company having no heavy-machinery safety policies, and when employees get hurt and can't work anymore, just firing them with no workers' comp.
For comparison, hospitals and law enforcement provide therapy and/or other mental health resources, since those jobs put people in potentially traumatic situations with some frequency (e.g. a doctor or nurse witnessing death a lot).
Exposed to a firehose of the worst humanity has to offer. I can't even imagine.
Go on the Reddit front page or Twitter home page with a heart-rate and blood-pressure monitor and scroll for 10 minutes.
You'll find that in almost every instance, both measures go up. The whole point of social media is to agitate, because agitation correlates with engagement, which correlates with mucho ad dinero.