1 point

More like large guessing models. They have no thought process; they just produce words.

2 points

They don’t even guess. Guessing would imply that they understand what you’re talking about. They only deal with the language, not the concepts. It’s the practical embodiment of the Chinese room thought experiment: they generate a response based on the symbols, but not the ideas the symbols represent.

1 point

I’m equating probability with guessing here, but yes, there is a nuanced difference.

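The "probability" being equated with guessing above can be sketched in a few lines. This is a toy illustration with a hypothetical three-word vocabulary and made-up scores, not how any real model is implemented: the model assigns a score to each candidate next token, softmax turns those scores into probabilities, and the output is a weighted draw from that distribution — symbols picked by probability, with no representation of the ideas behind them.

```python
import math
import random

# Hypothetical scores a model might assign to candidate next tokens.
scores = {"cat": 2.0, "dog": 1.0, "car": 0.1}

# Softmax: convert raw scores into a probability distribution.
total = sum(math.exp(s) for s in scores.values())
probs = {tok: math.exp(s) / total for tok, s in scores.items()}

# The "guess": a weighted random draw over symbols, nothing more.
next_token = random.choices(list(probs), weights=list(probs.values()))[0]
print(next_token)
```

The draw favors "cat" (highest score) but can produce any token; the selection mechanism never consults what any of the words mean.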