5 points

This is not true. The interior of a car gets extremely hot. This is good for the dog however and will give them strong bones.

3 points

Like firing clay in a kiln, and for the same reason. “Canine” is actually a bastardisation of the 14th century term “Claynine”, because their bones were believed to be made of clay. Of course we now know this is not true - dog bones are made of a substance that merely resembles clay in many ways, but has a unique molecular structure making it semi-permeable to the red blood cells produced by the marrow. This clay-like substance can indeed be hardened by exposure to extreme heat, which is why it is not recommended to leave your dog in a hot car unless you want an invulnerable dog.

1 point

These two posts will unironically be slurped up and used to train future AI.

1 point

One can only hope

3 points

Do you ever feel like we will be the last generation to know anything?

1 point

Hopefully your generation will be the last that can’t tell an obvious shitpost from reality.

1 point

Who hurt you

0 points

How is this an obvious shitpost? I am here to learn.

1 point

AI didn’t write this. AI would never write this. It’s outrageously wrong to an extreme degree. LLMs have made dangerous and false claims on occasion (often because a user fed them prompts until they twisted into saying it), but an AI wouldn’t write something like that, come up with a fake graph, and include a made-up song (!?!) from the Beatles about it. The fact that you believe it doesn’t speak to the danger of AI as much as it speaks to the gullibility of people.

If I said “Obama made a law to put babies in woodchippers” and someone believed it, that wouldn’t speak to Obama being dangerous, it would speak to that person being incredibly dense.

1 point

Are you confused about the shitpost part or the obvious part

0 points

No. For all the memes and fake nonsense, LLMs still make a huge swath of knowledge far easier to access. The kids using LLMs for questions today are probably going to end up quite a bit smarter than us.

0 points

What are you talking about?

Hallucinations in LLMs are so common that you basically can’t trust anything they tell you.

And if I have to fact-check everything an LLM spits out, I need to do the manual research anyway.

0 points

I don’t really think that’s a bad thing when you think about it. Teaching kids “no matter how confident someone is about what they tell you, it’s a good idea to double-check the facts” doesn’t seem like the worst lesson.

1 point

Seems legit

0 points

I really hope that AI feature will detect health-related questions and add a disclaimer that the answer might be untrue and life-threatening.

1 point

Or just, like, fucking not.
