59 points

> These types of errors happen even after including prompts like “Do not hallucinate.”

Genius! Why didn’t I think of that!

40 points

In every RAG guide I’ve seen, the suggested system prompt includes some more dignified variation of “Please, for the love of god, only and exclusively use the contents of the retrieved text to answer the user’s question, I am literally on my knees begging you.”

Also, if Reddit is any indication, a lot of people actually think that’s all it takes and that the hallucination stuff is just people using LLMs wrong. I mean, it would be insane to pour so much money into something so obviously fundamentally flawed, right?
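
For the record, the pattern being mocked looks roughly like this. A paraphrased sketch in Python; the function name and exact wording are mine, not from any specific guide:

```python
# A paraphrased sketch of the "grounding" system prompt pattern that RAG
# guides tend to recommend. Illustrative only; as the thread notes, this
# kind of instruction does not actually prevent hallucination.
def build_rag_prompt(question: str, retrieved_chunks: list[str]) -> str:
    context = "\n\n".join(retrieved_chunks)
    return (
        "Answer using ONLY the context below. If the answer is not in the "
        "context, say you don't know. Do not use outside knowledge.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}"
    )

if __name__ == "__main__":
    print(build_rag_prompt(
        "What is the return policy?",
        ["Returns are accepted within 30 days with a receipt."],
    ))
```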

8 points

Yeah, that method is clearly flawed. Not enough incense and prayers to the Machine God; no wonder the Machine Spirit is displeased. All praise the Machine God of Mars! Praise the Omnissiah!

19 points

I’ve tried screaming “stop overfilling my hboxes” when compiling my TeX document, but it isn’t working! Am I prompting it wrong?

8 points

yes. perhaps try more emotional guilt-tripping?

