13 points

I mean, LLMs can and will produce completely nonsensical outputs. It's less AI and more like bad text prediction.

13 points

"Regurgitation machine prone to hallucinations" is my go-to for explaining what LLMs really are.

1 point

Will they still be like that in ten years?

5 points

I've heard them described as bullshitting machines. They have no concept of, or regard for, truth or lies; they just spout whatever sounds good. Much of the time it's true. Too often it's not. Sometimes it's hard to tell the difference.

19 points

Yeah, but the point of the post is to highlight bias - and if there's one thing an LLM has, it's bias. I mean that literally: considering their probabilistic nature, it could be said that the only thing an LLM consists of is bias toward certain words given other words (the weights, to oversimplify).
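To make the "bias toward certain words given other words" point concrete, here's a toy sketch (my own illustration, not how a real LLM is built): a bigram model that estimates next-word probabilities from a tiny corpus. An actual LLM conditions on long contexts with billions of learned weights, but the output is the same kind of object - a probability distribution over the next token.

```python
from collections import Counter, defaultdict

# Toy corpus (hypothetical). A real LLM trains on vastly more text.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each other word.
counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def next_word_probs(word):
    """Probability of each next word given the previous word."""
    c = counts[word]
    total = sum(c.values())
    return {w: n / total for w, n in c.items()}

# The model's "knowledge" is nothing but these conditional biases.
print(next_word_probs("the"))  # -> {'cat': 0.5, 'mat': 0.25, 'fish': 0.25}
```

Note that nothing here checks whether "the cat ate the fish" is true; the model only tracks which words tend to follow which - which is the sense in which an LLM is "all bias."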


science

!science@lemmy.world
