Thanks to @General_Effort@lemmy.world for the links!

Here’s a link to Caltech’s press release: https://www.caltech.edu/about/news/thinking-slowly-the-paradoxical-slowness-of-human-behavior

Here’s a link to the actual paper (paywall): https://www.cell.com/neuron/abstract/S0896-6273(24)00808-0

Here’s a link to a preprint: https://arxiv.org/abs/2408.10234


We don’t think in “bits” at all because our brain functions nothing like a computer. This entire premise is stupid.

35 points

Bit in this context refers to the [shannon](https://en.wikipedia.org/wiki/Shannon_(unit)) from information theory. 1 bit of information (that is, 1 shannon) is the amount of information you receive from observing an event with a 50% chance of occurring. 10 bits would be equivalent to the amount of information learned from observing an event with about a 0.1% chance of occurring. So 10 bits in this context is actually not that small a number.
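To make that concrete, here's a small sketch of the definition (the function name `shannons` is just illustrative):

```python
import math

def shannons(p):
    """Information content, in shannons (bits), of observing an event with probability p."""
    return -math.log2(p)

print(shannons(0.5))       # 1.0 — a fair coin flip carries 1 bit
print(shannons(1 / 1024))  # 10.0 — an event with ~0.1% probability carries 10 bits
```

Both values are exact here because 1/2 and 1/1024 are powers of two.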

18 points

The paper gives specific numbers for specific contexts, too. It's a helpful illustration of these concepts:

A 3x3 Rubik’s cube has about 2^65 possible permutations, so the configuration of a Rubik’s cube carries about 65 bits of information. In the world record for blind solving, where the solver examines the cube, puts on a blindfold, and solves it blindfolded, the solver examined the cube for 5.5 seconds, so those 65 bits were acquired at a rate of about 11.8 bits/s.

Another memory contest has people memorizing strings of binary digits for 5 minutes and then recalling them. The world record is 1467 digits (exactly 1467 bits); dividing by 5 minutes, or 300 seconds, gives a rate of about 4.9 bits/s.

The paper doesn’t discuss how the human brain is more optimized for some tasks than others. I suspect that the brain’s capacity for visual processing, likely aided by preprocessing that happens subconsciously during direct visual perception, is much more efficient and capable than rote memorization. So I’m still skeptical of a blanket 10-bit/s rate for all types of thinking, but I can see how they got the number.

Deleted by creator
7 points

It’s a fair metric IMO.

We typically judge supercomputers in FLOPS (floating-point operations per second).

We don’t take into account any of the compute power required to keep the machine powered, keep it cool, operate peripherals, etc., even though all of that is happening in the background. Heck, FLOPS doesn’t even measure memory, storage, power draw, number of cores, clock speed, architecture, or any other useful attribute of a computer.

This is just one metric.

0 points

10 shannons, that is, 10 bits, each with 50% probability, would be equivalent to the amount of information gained from observing an event with a 1/1024 chance of occurring, not 1/10. That’s because the probabilities combine multiplicatively while the information adds. The Wikipedia article mentions that if there are 8 possible events with equal probability, the information content is 3 shannons.
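The multiplicative combination is easy to check numerically (a minimal sketch):

```python
import math

# Ten independent 50% events: probabilities multiply, information adds.
p_one = 0.5
p_ten = p_one ** 10            # 1/1024
print(-math.log2(p_ten))       # 10.0 shannons

# 8 equally likely outcomes carry log2(8) = 3 shannons.
print(-math.log2(1 / 8))       # 3.0
```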

8 points

Right, 1/1024 is 0.0009765625 or about 0.1%.

5 points

I also don’t have 10 fingers. That doesn’t make any sense - my hands are not numbers!

Ooooor “bits” has a meaning beyond what you assume, but it’s probably just science that’s stupid.

Deleted by creator
14 points

You say “we don’t think in bits because our brains function nothing like computers”, but bits aren’t strictly related to computers. Bits are about information. And since our brains are machines that process information, bits are also applicable to those processes.

To show this, I used an analogy. We say that people have 10 fingers, yet our hands have nothing to do with numbers. That’s because the concept of “10” is applicable both to math and to topics that math can describe, just like “bits” are applicable both to information theory and to topics that information theory can describe.

For the record: I didn’t downvote you, it was a fair question to ask.

I also thought of a better analogy: imagine someone tells you they measured the temperature of a distant star, and you say “that’s stupid, you can’t get a thermometer to a star and read the measurement, you’d die”, just because you don’t know how one could measure it.

3 points

Also, supposing it did, I’m quite sure everyone’s brain would function at a different rate. And how do you even measure people who don’t have an internal monologue? It seems like there’s a lot missing here.

4 points

It’s an average. The difference between two humans will be much less than the difference between humans and machines.
