1 point
More like large guessing models. They have no thought process; they just produce words.
2 points
They don’t even guess. Guessing would imply they understand what you’re talking about. They only think about the language, not the concepts. It’s the practical embodiment of the Chinese room thought experiment: they generate a response based on the symbols, but not the ideas the symbols represent.
1 point