13 points
I mean, LLMs can and will produce completely nonsensical outputs. It's less "AI" and more like bad text prediction.
13 points
Regurgitation machine prone to hallucinations is my go-to for explaining what LLMs really are.
19 points
Yeah, but the point of the post is to highlight bias, and if there's one thing an LLM has, it's bias. I mean that literally: given their probabilistic nature, you could say the only thing an LLM consists of is bias toward certain words given other words (the weights, to oversimplify).
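To illustrate the "bias toward certain words given other words" idea in miniature: a bigram model (a toy stand-in for an LLM, vastly simpler than real transformer weights) is literally nothing but learned conditional word preferences. The corpus and function names here are made up for the example.

```python
from collections import Counter, defaultdict

# Toy corpus; a real LLM trains on billions of tokens, not nine words.
corpus = "the cat sat on the mat the cat ate".split()

# Count how often each word follows each other word.
counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def next_word_probs(prev):
    """The model's entire 'knowledge': P(next word | previous word)."""
    total = sum(counts[prev].values())
    return {w: c / total for w, c in counts[prev].items()}

print(next_word_probs("the"))  # biased toward "cat" (2/3) over "mat" (1/3)
print(next_word_probs("cat"))  # split evenly between "sat" and "ate"
```

The "model" here is just the `counts` table, i.e. pure bias; an actual LLM replaces the table with billions of weights and conditions on long contexts, but the output is still a probability distribution over next tokens.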