Thanks to @General_Effort@lemmy.world for the links!

Here’s a link to Caltech’s press release: https://www.caltech.edu/about/news/thinking-slowly-the-paradoxical-slowness-of-human-behavior

Here’s a link to the actual paper (paywall): https://www.cell.com/neuron/abstract/S0896-6273(24)00808-0

Here’s a link to a preprint: https://arxiv.org/abs/2408.10234

We don’t think in “bits” at all because our brain functions nothing like a computer. This entire premise is stupid.

35 points

Bit in this context refers to the [Shannon](https://en.wikipedia.org/wiki/Shannon_(unit\)) from information theory. 1 bit of information (that is, 1 shannon) is the amount of information you receive from observing an event with a 50% chance of occurring. 10 bits would be equivalent to the amount of information learned from observing an event with about a 0.1% chance of occurring. So 10 bits in this context is actually not that small of a number.
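If it helps to see the arithmetic, here’s a quick sketch of the definition (my own illustration, not anything from the paper):

```python
import math

def surprisal_bits(p: float) -> float:
    # Information (in shannons, i.e. bits) gained by observing an event of probability p
    return -math.log2(p)

print(surprisal_bits(0.5))       # 1.0  -> a 50% event carries 1 bit
print(surprisal_bits(1 / 1024))  # 10.0 -> a ~0.1% event carries 10 bits
```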

18 points

The paper gives specific numbers for specific contexts, too. It’s a helpful illustration for these concepts:

A 3x3 Rubik’s cube has about 2^65 possible permutations (roughly 4.3×10^19), so the configuration of a Rubik’s cube carries about 65 bits of information. The world record for blind solving, where the solver examines the cube, puts on a blindfold, and then solves it, involved examining the cube for 5.5 seconds, so those 65 bits were acquired at a rate of about 11.8 bits/s.

Another memory contest has people memorizing strings of binary digits for 5 minutes and then trying to recall them. The world record is 1467 digits, which is exactly 1467 bits; dividing by 5 minutes (300 seconds) gives a rate of about 4.9 bits/s.
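If anyone wants to double-check those numbers, the arithmetic is short (a rough sketch; the cube’s state count is the standard ~4.3×10^19 figure):

```python
import math

# Number of reachable states of a 3x3 Rubik's cube (~4.3 * 10^19, about 2^65)
CUBE_STATES = 43_252_003_274_489_856_000

cube_bits = math.log2(CUBE_STATES)   # ~65.2 bits to pin down one configuration
print(cube_bits / 5.5)               # ~11.8-11.9 bits/s over a 5.5 s inspection

# Binary-digit memorization record: 1467 binary digits in 5 minutes
print(1467 / 300)                    # ~4.9 bits/s
```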

The paper doesn’t talk about how the human brain is more optimized for some tasks than for others, and I definitely believe that the brain’s capacity for visual processing (probably assisted by the preprocessing that happens subconsciously) or its direct perception of visual information is much more efficient and capable than plain memorization. So I’m still skeptical of the blanket 10-bit rate for all types of thinking, but I can see how they got the number.

Deleted by creator
7 points

It’s a fair metric IMO.

We typically judge supercomputers in FLOPS: floating-point operations per second.

We don’t take into account any of the compute power required to keep the machine powered, keep it cool, operate peripherals, etc., even if that is happening in the background. Heck, FLOPS doesn’t even really measure memory, storage, power, number of cores, clock speed, architecture, or any other useful attribute of a computer.

This is just one metric.

0 points

10 shannons, that is, 10 bits, each with 50% probability, would be equivalent to the amount of information gained from observing an event with a 1/1024 chance of occurring, not 1/10. That’s because the underlying probabilities combine multiplicatively while the bits add. The Wikipedia article mentions that if there are 8 possible events with equal probability, the information content is 3 shannons.
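A tiny sketch of that, just to show how the probabilities multiply while the bits add:

```python
import math

# 8 equally likely outcomes -> log2(8) = 3 shannons, as the Wikipedia article says
print(math.log2(8))   # 3.0

# Ten independent 50/50 events: the probabilities multiply...
p = 0.5 ** 10         # 1/1024
# ...but the information adds
print(-math.log2(p))  # 10.0, the same as ten separate 1-bit observations
```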

8 points

Right, 1/1024 is 0.0009765625 or about 0.1%.

5 points

I also don’t have 10 fingers. That doesn’t make any sense - my hands are not numbers!

Ooooor “bits” has a meaning beyond what you assume, but it’s probably just science that’s stupid.

Deleted by creator
14 points

You say “we don’t think in bits because our brains function nothing like computers”, but bits aren’t strictly related to computers. Bits are about information. And since our brains are machines that process information, bits are also applicable to those processes.

To show this, I chose an analogy. We say that people have 10 fingers, yet our hands have nothing to do with numbers. That’s because the concept of “10” is applicable both to math and topics that math can describe, just like “bits” are applicable both to information theory and topics that information theory can describe.

For the record: I didn’t downvote you, it was a fair question to ask.

I also thought about a better analogy - imagine someone tells you they measured the temperature of a distant star, and you say “that’s stupid, you can’t get a thermometer to a star and read the measurement, you’d die”, just because you don’t know how one could measure it.

3 points

Also, supposing it did, I’m quite sure that everyone’s brain would function at a different rate. And how do you even measure people who don’t have an internal monologue? Seems like there is a lot missing here.

4 points

It’s an average. The difference between two humans will be much less than the difference between humans and machines.

49 points

Bullshit. Just reading this and comprehending it, which is thought, far exceeds 10 bits per second.
Speaking, which is conveying thought, also far exceeds 10 bits per second.

This piece is garbage.

3 points

You’d be surprised how little information you need to do so much. 10 bits seems a little low to me too, but that’s narrowing things down by a factor of 1024 every second. Not bad.

23 points

Speaking, which is conveying thought, also far exceeds 10 bits per second.

There was a study in 2019 that analyzed 17 different spoken languages and found that languages with a lower information density (bits of information per syllable) tend to be spoken faster, so that the overall information rate is roughly the same across spoken languages, at roughly 39 bits per second.

Of course, it could be that the actual ideas and information in that speech are inefficiently encoded, so that the actual bits of entropy are being communicated slower than 39 per second. I’m curious what the linked Caltech paper says about language processing, since the press release describes deriving the 10 bits from studies analyzing how people read and write (as well as studies of people playing video games or solving Rubik’s cubes). Are they including the additional overhead of processing that information into new knowledge or insights? Are they defining the entropy of human language with a higher implied compression ratio?

EDIT: I read the preprint, available here. It purports to measure externally measurable output of human behavior. That’s an important limitation in that it’s not trying to measure internal richness in unobserved thought.

So it analyzes people performing external tasks, including typing and speech with an assumed entropy of about 5 bits per English word. A 120 wpm typing speed therefore translates to 600 bits per minute, or 10 bits per second. A 160 wpm speaking speed translates to 13 bits/s.
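The word-per-minute arithmetic is easy to sanity-check (a quick sketch using the paper’s assumed 5 bits per word):

```python
BITS_PER_WORD = 5  # entropy per English word assumed in the preprint

def wpm_to_bits_per_second(words_per_minute: float) -> float:
    return words_per_minute * BITS_PER_WORD / 60

print(wpm_to_bits_per_second(120))  # 10.0 bits/s for 120 wpm typing
print(wpm_to_bits_per_second(160))  # ~13.3 bits/s for 160 wpm speech
```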

The calculated bits of information are especially interesting for the other tasks (blindfolded Rubik’s cube solving, memory contests).

It also explicitly cited the 39 bits/s study that I linked as being within the general range, because the actual meat of the paper is analyzing how the human brain whittles roughly 10^9 bits/s of sensory input down by about 8 orders of magnitude. If it turns out to be 7.5 orders of magnitude instead, that doesn’t really change the result.

There’s also a whole section addressing criticisms of the 10 bit/s number. It argues that claims of photographic memory tend to break down over longer periods of study (e.g., a 45-minute flyover of Rome followed by recognizing and recreating 1,000 buildings drawn from 1,000 architectural styles translates into about 4 bits/s of memorization). And it argues that the human brain tends to trick itself into perceiving much higher complexity than it is actually processing (known as “subjective inflation”), implicitly arguing that a lot of that is actually lossy compression that fills in fake details consistent with the portions actually perceived, and that the observed bitrate from other experiments might not properly account for the bits of entropy involved in the less accurate shortcuts taken by the brain.

I still think visual processing seems to be faster than 10, but I’m now persuaded that it’s within an order of magnitude.

4 points

Thanks for the link and breakdown.

It sounds like a better description of the estimated thinking speed would be 5-50 bits per second. And when summarizing capacity or capability, one generally uses a number near the top end. It makes far more sense to say we are capable of 50 bps but often use less than to say we are only capable of 10 but sometimes do more than we are capable of doing. And the paper leans hard into 10 bps being an internally imposed limit rather than a conditional one, going so far as to say a neural-computer interface would be limited to this rate.

“Thinking speed” is also a poor description for input/output measurement, akin to calling a monitor’s bitrate the computer’s FLOPS.

Visual processing is multi-faceted. I definitely don’t think all of vision can be reduced to 50 bps, but maybe the serial part can, after the parallel parts have done things like detecting lines, arcs, textures, areas of contrast, etc.

1 point

with an assumed entropy of about 5 bits per English word. A 120 wpm typing speed therefore translates to 600 bits per minute, or 10 bits per second. A 160 wpm speaking speed translates to 13 bits/s.

The problem here is that the bits of information need to be clearly defined, otherwise we are not talking about actually quantifiable information. Normally a bit can only have 2 values; here they are talking about a very different kind of bit, which AFAIK is not a specific quantity.

the human brain tends to trick itself into perceiving a much higher complexity that it is actually processing

This is of course a thing.

3 points

The problem here is that the bits of information need to be clearly defined, otherwise we are not talking about actually quantifiable information

here they are talking about a very different kind of bit

I think everyone agrees on the definition of a bit (a binary two-value variable), but the active area of debate is which pieces of information actually matter. If information can be losslessly compressed into smaller representations of that same information, then the smaller compressed size represents the informational complexity in bits.

The paper itself describes the information that can be recorded but ultimately discarded as not relevant: for typing, the forcefulness of each key press or duration of each key press don’t matter (but that exact same data might matter for analyzing someone playing the piano). So in terms of complexity theory, they’ve settled on 5 bits per English word and just refer to other prior papers that have attempted to quantify the information complexity of English.
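If you want a hands-on feel for “information complexity as compressed size”, here’s a toy upper-bound estimate (the file name is just a placeholder; serious entropy estimates for English use large corpora or language models, not one zlib call):

```python
import zlib

# "english_sample.txt" is a placeholder: any reasonably large chunk of English text
with open("english_sample.txt", "rb") as f:
    text = f.read()

compressed = zlib.compress(text, level=9)
n_words = len(text.split())

# The compressed size is an upper bound on the entropy of the text
print(f"~{len(compressed) * 8 / n_words:.1f} bits per word (zlib upper bound)")
```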

7 points

You may be misunderstanding the bit measure here. It’s not ten bits of information in the storage sense, basically a single byte. It’s ten binary yes/no decisions, enough to single out one of 1024 distinct possibilities.

The measure comes from information theory but it is easy to confuse it with other uses of ‘bits’.

3 points

What? This is the perfectly normal meaning of bits. 2^10 = 1024.

7 points

Only when you are framing it in terms of information entropy. I think many of those misunderstanding the study are thinking of bits as part of a standard byte. It’s a subtle distinction, but that’s where I think the disconnect is.

3 points

I think we understand that a computer can read this text far faster than any of us. That is not the same as conscious thought, though; it’s simply following an algorithm of yes/no decisions.

I’m not arguing with anything here, just pointing out the difference in what CPUs do and what human brains do.

3 points

You’re misunderstanding the terminology used then.

In information theory, “bit” doesn’t mean “bitrate” like you’d see in networks, but something closer to “compressed bitrate.”

For example, let’s say I build a computer that only computes small sums, where the input is two integers from 0 to 127. However, this computer only understands spoken French, and it will ignore anything that’s not a French number in that range. Information theory would say this machine receives 14 bits of information (two 7-bit numbers) and returns 8 bits. The extra processing needed to understand French is waste and is ignored for the purposes of calculating entropy.
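The bit counts for that toy adder just come from counting the possible inputs and outputs:

```python
import math

# Two inputs, each an integer in 0..127 -> 7 bits apiece, 14 bits total
input_bits = 2 * math.log2(128)
# The sum ranges over 0..254, i.e. 255 possible outputs -> needs 8 bits
output_bits = math.ceil(math.log2(255))

print(input_bits, output_bits)  # 14.0 8
```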

The article also mentions that our brains take in billions of bits of sensory data, but that’s ignored for the calculation because we only care about the thought process (the useful computation), not all of the overhead of the system.

0 points

I think I was pretty clear about the understanding or comprehension part, which is not merely input/output.

2 points

Indeed it is. If you want to illustrate the point that silicon and copper are faster than bioelectric lumps of fat, there are lots of ways to do this and it’s no contest, but this is not a well-done study.

1 point
Deleted by creator
4 points

Yes, fixed it. Only thinking at 8 bits a second this morning

1 point

Right? They do nothing to expand upon what this plainly wrong claim is supposed to actually mean. Goddamn scientists need a marketing department of their own, because the media sphere in general sells their goods however the fuck they feel like packaging them.

0 points

How are you measuring it?

12 points

There’s no plausible way to even encode any arbitrary idea into 10 bits of information.

10 points

That doesn’t really matter, because 1 bit merely distinguishes between one and zero, or some other two-valued alternative.
Just by reading a single word, you pick that word out from roughly 30,000 words you know. That’s about 15 bits of information comprehended.
Don’t tell me you take more than 1.5 seconds to read and comprehend one word.
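For what it’s worth, here’s that arithmetic as a quick sketch (it assumes all 30,000 words are equally likely, which overstates the entropy of real text; the paper’s sources put English closer to 5 bits per word):

```python
import math

# One word singled out of ~30,000 known words, if every word were equally likely
bits_per_word = math.log2(30_000)
print(bits_per_word)        # ~14.9 bits

# At one word every 1.5 seconds, that naive estimate gives
print(bits_per_word / 1.5)  # ~9.9 bits/s
```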

Without having it as text, free thought is CLEARLY much faster, and the complexity of abstract thinking would move the number way up.
One thought is not 1 bit; it can be thousands of bits.

BTW, the mind has insane levels of compression. For instance, if you think “bicycle”, it’s a concept that covers many parts. You don’t have to think about every part; you know it has a handlebar, frame, pedals, and wheels. You also know its purpose, size, weight, range of speeds, and many other more or less relevant details. Just thinking “bicycle” is easily way more than 10 bits’ worth of information, but it all gets “compressed” down to only the parts relevant to the context.

Reading and understanding one word is not just understanding a word, but also understanding a concept and putting it into context. I’m not sure how to quantify that, but quantifying it as 1 bit is so horrendously wrong that I find it hard to understand how this can in any way be considered scientific.

2 points

You are confusing input with throughput. They agree that the input is much greater. It’s the throughput that is so slow. Here’s the abstract:

This article is about the neural conundrum behind the slowness of human behavior. The information throughput of a human being is about 10 bits/s. In comparison, our sensory systems gather data at ~10^9 bits/s. The stark contrast between these numbers remains unexplained and touches on fundamental aspects of brain function: what neural substrate sets this speed limit on the pace of our existence? Why does the brain need billions of neurons to process 10 bits/s? Why can we only think about one thing at a time? The brain seems to operate in two distinct modes: the “outer” brain handles fast high-dimensional sensory and motor signals, whereas the “inner” brain processes the reduced few bits needed to control behavior. Plausible explanations exist for the large neuron numbers in the outer brain, but not for the inner brain, and we propose new research directions to remedy this.

0 points

Try to read this next part without understanding it. If you know English, you will find it impossible NOT to find meaning in these letters displayed in a row. That’s more like subconscious processing. If you’re learning to read English, then there’s likely an active “thought” appearing in your experience. See a difference?

1 point

I absolutely do, which is why I concentrated on the THOUGHT part, as in understanding. You obviously can’t have understanding without thought. That’s the difference between data and information.
Please, I have 40 years of experience with philosophical issues regarding intelligence and consciousness, also from a programmer’s perspective.

2 points

Sorry if my replies are annoying, I merely find this subject interesting. Feel free to ignore this.

It is not obvious to me why a being couldn’t have an “understanding” without a “thought”. I do not believe it’s possible to verify whether a creature has a subjective experience, but an “understanding” of the environment could be attributed to how a being performs in that environment (a crow gets food that was just out of reach inside a tube of water by adding rocks to raise the water level). I have some experience in game-dev programming and a limited, personal-interest understanding of consciousness, if that’s helpful.

1 point

Understanding it is active thought. And processing the words, as words with meaning, is required to formulate a relevant response.

The more-than-10 bits that each word carries are part of your active thought.

2 points

I think we may disagree on term definitions.

I perceive “active thought” when trying to decipher parts of a sentence I do not already have an understanding of. If I already understand a part then no active thought is perceived by me - like driving a car when nothing eventful is happening. [Note: I don’t believe I have 100% accurate perception of my own subjective experience. Trying to focus on subjective experience at all instead of constantly being “lost in thought” is very short lived]

40 points

Because it’s a Techspot article, of course they deliberately confuse you as to what “bit” means to get views. https://en.wikipedia.org/wiki/Entropy_(information_theory) seems like a good introduction to what “bit” actually means.

4 points

Base 2 gives the unit of bits

Which is exactly what bit means.

base 10 gives units of “dits”

Which is not bits, but the equivalent unit for one digit in base 10.

I have no idea how you think this changes anything about what a bit is?

2 points

base 10 gives units of “dits”

I read ‘tits’ and about died laughing 😭

11 points

The data-storage bit and the shannon are both called bits precisely because they’re both base 2. That does not mean they’re the same thing. As the article explains it, a shannon is like one question in a game of 20 Questions.

1 point

Wrong. They are called the same because they are fundamentally the same. That’s how you measure information.

In some contexts, one wants to draw a distinction between the theoretical information content and what is actually stored on a technical device. But that’s a fairly subtle thing.

-2 points

That is a fair criticism.

4 points

Did you actually read it?
Because it’s not:

Base 2 gives the unit of bits

Which is exactly what bit means.

base 10 gives units of “dits”

Which is not bits, but the equivalent unit for one digit in base 10.

38 points

ITT: A bunch of people who have never heard of information theory suddenly have very strong feelings about it.

4 points

If they had heard of it, we’d probably get statements like: “It’s just statistics.” or “It’s not information. It’s just a probability.”

14 points

Some parts of the paper are available here: https://www.sciencedirect.com/science/article/abs/pii/S0896627324008080?via%3Dihub

It doesn’t look like these “bits” are binary, but “pieces of information” (which I find a bit misleading):

“Quick, think of a thing… Now I’ll guess that thing by asking you yes/no questions.” The game “Twenty Questions” has been popular for centuries as a thinking challenge. If the questions are properly designed, each will reveal 1 bit of information about the mystery thing. If the guesser wins routinely, this suggests that the thinker can access about a million possible items in the few seconds allotted. Therefore, the speed of thinking—with no constraints imposed—corresponds to 20 bits of information over a few seconds: a rate of 10 bits/s or less.
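The Twenty Questions framing is essentially binary search; here’s a toy sketch of the counting (my own illustration, not code from the paper):

```python
import math

# Twenty well-designed yes/no questions can distinguish 2**20 (~1 million) things
print(math.log2(2 ** 20))  # 20.0 bits

# Binary search over a numbered list of "things" plays the role of an ideal guesser
def count_questions(secret: int, n: int = 2 ** 20) -> int:
    lo, hi, asked = 0, n - 1, 0
    while lo < hi:
        mid = (lo + hi) // 2
        if secret <= mid:   # "Is it in the lower half?" -> yes
            hi = mid
        else:               # -> no
            lo = mid + 1
        asked += 1
    return asked

print(count_questions(123_456))  # 20 questions, i.e. 20 bits revealed
```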

The authors do draw a distinction between the sensory processing and cognition/decision-making, at least:

To reiterate: human behaviors, including motor function, perception, and cognition, operate at a speed limit of 10 bit/s. At the same time, single neurons can transmit information at that same rate or faster. Furthermore, some portions of our brain, such as the peripheral sensory regions, clearly process information dramatically faster.

2 points

So ten concepts per second? Ten ideas per second? This sounds a little more reasonable. I guess you have to read the word “bit” like you’re British, and it just means “part.” Of course this is still miserably badly defined.

-5 points

But our brains are not digital, so they cannot be measured in binary bits.

13 points

There is no other definition of bit that is valid in a scientific context. Bit literally means “binary digit”.

Information theory, using bits, is applied to the workings of the brain all the time.

-9 points

How do you know there is no other definition of bit that is valid in a scientific context? Are you saying a word can’t have a different meaning in a different field of science?

3 points

All information can be stored in a digital form, and all information can be measured in base 2 units (of bits).

-2 points

But it isn’t stored that way and it isn’t processed that way. The preprint appears to give an equation (beyond my ability to understand) which explains how they came up with it.

3 points

Indeed not. So using language specific to binary systems - e.g. bits per second - is not appropriate in this context.
