From the article:

This chatbot experiment reveals that, contrary to popular belief, many conspiracy thinkers aren’t ‘too far gone’ to reconsider their convictions and change their minds.

11 points

At first glance the major takeaway here might be that AI can do a Gish gallop, but with the truth instead of lies.

And it doesn’t get exhausted with somebody’s bad faith bullshit.

17 points

More like LLMs are just another type of propaganda. The only thing that can effectively retool conspiracy thinkers is a better education with a focus on developing critical thinking skills.

3 points

That’s just what the machines want you to believe.

13 points

All of this could be mitigated far more effectively by ensuring every citizen gets a decent education by modern standards. Turns out most of our problems can be fixed by helping each other.

6 points

“Great! Billy doesn’t believe 9/11 was an inside job, but now the AI made him believe Bush was actually president in 1942 and that Obama was never president.”

In all seriousness, I think an “unbiased” AI might be one of the few ways to reach people about this stuff, because any Joe Schmoe who tries to confront a conspiracy is just dismissed as “believing what they want you to believe!”

5 points

Given the inherent biases in any LLM’s training data, the hallucination issue you’ve brought up, and the cost of running an LLM at scale being prohibitive for anyone besides private-state partnerships, do you think this will allay conspiracists’ valid concerns about the centralization of information access, à la the decline in quality of Google search results over the past decade and a half?

3 points

I think those people might not be swayed, but I was once a “conspiracy nut,” had a circle of friends who were as well, and know that for a lot of those kinds of people YouTube is the majority of the “research” they do. For them, I think this could work, as long as it isn’t hallucinating and can point to proper sources.

