
I’ve seen this with GPT-4. If I ask it to proofread text with errors, it consistently does a great job, but if I ask it to proofread text without errors, it hallucinates some anyway. It’s funny to see Microsoft having the same issue.

To be fair, this is how humans do it, too.
