… and neither does the author (or so I believe - I made them both up).

On the other hand, AI is definitely good at creative writing.

26 points

More like creative bullshitting.

It seems that Mitchell was simply an astronaut, not an engineer.

-13 points

You can trigger hallucinations in today’s LLMs with this kind of question. Same with a knife: you can hurt yourself by misusing it … and in fact you have to be knowledgeable and careful with both.

12 points

The knife doesn’t insist it won’t hurt you, and you can’t get cut holding the handle. AI, by contrast, insists it is correct, and you can get false information using it as intended.

1 point

I would argue it’s not the AI itself but the companies that make it that are making unattainable promises and misleading people.


Guns don’t kill people. People kill people.

🙄

3 points

Are you suggesting the AI would appear spontaneously without those companies existing?

6 points

And it’s the fault of crazy kids that school shootings happen. And absolutely nothing else.

/s

4 points

can’t wait for gun companies to start advertising their guns as “intelligent” and “highly safe”

7 points

Maybe ChatGPT should find a way to physically harm users when it hallucinates? Maybe then they’d learn.

2 points

AI-hallucinated books describing which mushrooms you can pick in the forest have been published, and some people have died because of them.
We have to be careful when using AI!

20 points

This is why I never raw dog ChatGPT

9 points

Hallucinations are so strong with this one too… like really bad.

If I can’t already verify an output, or won’t be able or willing to, I ain’t usin’ it - not a bad rule, I think.

5 points

At least Bing will cite sources, and hell, sometimes they even align with what it said.

3 points

Heh, yeah - as long as the titles of the webpages from its searches were descriptive enough.

Funny that they couldn’t find a way to stop it from claiming it can browse websites. Last I checked, you could paste in something like

https://mainstreamnewswebsite.com/dinosaurs-found-roaming-playground

and it would tell you which species were nibbling the rhododendrons.

…wow still works, gonna make a thread

2 points

Clowning

(I’m not smart enough to leverage a model or make a bot like this, but they’ve had too long not to close this obvious misinformation hole)

6 points

I never walk away with an “answer” without having it:

  1. Cite the source
  2. Look up the source
  3. Permalink you to the source page/line as available
  4. Critique the validity of the source

After all that, still remain skeptical and take the discussion as a starting point to find your own primary sources.
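The cheapest part of that workflow can even be automated. A minimal sketch, assuming a hypothetical `source_supports_claim` helper (not something any commenter here built): once you’ve fetched the cited page yourself, check that it actually contains the text the model attributed to it.

```python
def source_supports_claim(page_text: str, quoted_snippet: str) -> bool:
    """Naive check that a cited page actually contains the quoted text.

    Case- and whitespace-insensitive substring match. A real workflow
    would still fetch the URL yourself and judge the source's reliability
    by hand; this only catches outright fabrication.
    """
    def normalize(s: str) -> str:
        return " ".join(s.lower().split())

    return normalize(quoted_snippet) in normalize(page_text)


# Example: a fabricated quote fails even this trivial check.
print(source_supports_claim("Nothing about dinosaurs here.",
                            "dinosaurs roaming the playground"))
```

Any citation that fails a check this cheap was never worth trusting in the first place.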

2 points

That’s good. Ooh NotebookLM from Google just added in-line citations (per Hard Fork podcast). I think that’s the way: see what looks interesting (mentally trying not to take anything to heart) and click and read as usual.

BeyondPDF for Mac does something similar: it semantically searches your document but simply returns likely matches, so it’s just better search for when you don’t remember the specific words you read or want to find something without knowing the exact search criteria.

11 points

Tried it with ChatGPT 4o, with a different title/author. It said it couldn’t find it - that it might be a new release or a lesser-known title. Also tried a fake title with a real author. Again, it said the book didn’t exist.

They’re definitely improving on the hallucination front.


I tried to use ChatGPT to find a song that had a particular phrase in it. I could only remember that phrase, not the song or the band.

It hallucinated a band and a song, and I almost walked away thinking I knew the answer. Then I remembered this is ChatGPT and it lies. So I looked up that band and song through conventional means.

Neither. Existed.

So I went back to ChatGPT and said “<band> doesn’t even exist so they couldn’t have written <song> (which also doesn’t exist)”. It apologized profusely and then said another band and song. This time I was wary and checked right away at which point, naturally, I discovered neither existed.

So I played with ChatGPT instead and said “Huh, those guys look interesting. What other albums have they released and what hits have they written?”

ChatGPT hallucinated an entire release catalogue of albums that don’t exist, one of which was published on a label that doesn’t exist, citing songs that didn’t exist as their hits, even going so far as to say the band never reached higher than #12 on Billboard’s list.

ChatGPT is a dangerous tool. It’s going to get someone killed sooner, rather than later.

11 points

Did you ever find the song?


Nope. And it wasn’t important enough for me to bother finding. I just thought it would be an interesting test of degenerative AI’s incapabilities.
