To be clear, sometimes authority bias is good and proper. For instance, valuing the opinion of a climate scientist who has been studying the climate for thirty years over that of your aunt, who saw Rush Limbaugh call climate change a hoax in the 1990s, is normal and rational.
Basically, authority bias as a reasoning flaw stems from misidentifying who is authoritative on a subject.
In a vacuum, appealing to authority is fallacious; an idea must stand on its own merits.
IRL, things get fuzzy. No one has the expertise or the time to derive everything from first principles and redo every experiment ever performed, so sadly we have to place some level of trust in other people.
As long as the paper documents the experiment well and the study is double-blind, you don't need to appeal to authority.
I guess authority bias is most absurd when one tries to use it as a crutch to validate an argument.
"You should believe me simply because researcher X said this about the topic."
I have to respectfully disagree with your example. Ostensibly, the researcher should be an authority. I think the example given in the chart is not quite right either; the confusion comes from the three definitions of "authority":
1. the power or right to give orders, make decisions, and enforce obedience: "he had absolute authority over his subordinates"
2. a person or organization having power or control in a particular, typically political or administrative, sphere: "the health authorities"
3. the power to influence others, especially because of one's commanding manner or one's recognized knowledge about something
In your example, the "Authority" is definition 3: someone with specialized knowledge of a topic who should be listened to by those who are lay on the topic.
In the chart, I think they were going for definition 1, which is the correct source of authority bias, but they didn't want to step on toes or get political. The actual example is someone with decision-making authority, like a police officer, a politician, or a boss at a workplace, whose statements a listener automatically believes regardless of the speaker's actual specialized knowledge of the topic. A better example would be "Believing a vaccine is dangerous because a politician says it is."
This all feeds into a topic I have been kicking around in my head for a while and have been contemplating writing up as a book: "The Death of Expertise." So many people have been brainwashed to the point that authorities under definition 3 are met with a frankly asinine amount of incredulity, while authorities under definition 1 are trusted regardless of education or demonstrable specialized knowledge.
I'll also have to respectfully disagree with you on this. If I'm listening to someone speak on a topic they're an authority on by your 3rd definition, that is not a yardstick for them to claim correctness. Yes, I'm probably better off listening to them than to a layperson, but it still doesn't give them the right to claim correctness, nor does it grant me the right to rehash their claims and say I should be listened to because I'm regurgitating the words of an expert. All assertions should be backed up by verifiable sources.
I'm interested to hear about that book, though.
YSK: the Dunning-Kruger effect is controversial because it's caught up in psychology's replication crisis.
Other famous psychology experiments, like the Stanford prison experiment and the Milgram experiment, fail to show what you learned in Psych 101. The prison experiment was so flawed as to be useless, and variations on the Milgram experiment show the opposite effect from the original.
For those familiar with the Milgram experiment: in one variation of the study, the "scientist" running the test was replaced with a policeman or a military officer. Under those conditions, almost everybody refused to administer the high voltage.
Controversial in the sense that it can easily be applied to anyone. There is some substance to the idea that a person can trick themselves into thinking they know more than they do based on limited info. A lot of these biases are like that: they aren't cut-and-dried, but more of a gray area where people can be fooled in various ways. Critical thinking is hard even when it's taught, and it isn't taught well enough, or at all.
And all of that is my opinion and falls into various biases itself, but oh well. The easiest person to fool is yourself, because our brains are hardwired to want to be right and to reward us when we find things that confirm it, even when the evidence isn't valid. I think the best way to avoid the pitfalls is to always back up your claim with something. I've often(!) found myself erasing a response because the reply I was going to make didn't have the data I thought it did, and after digging a bit I couldn't show I was correct.
I almost deleted this for that very reason, but I want to see how it lands. I feel that knowing there are a lot of biases anyone can fall into can help you form better reasoning and arguments.
What bias is it if the only entry I’ve read in this table is the one for confirmation bias?
For negativity bias, my wife just told me a great technique that she uses. Come up with a list of people whose opinions matter to you. Any time you question yourself, imagine how each person on that list would react to what you did. Since those are the only people whose opinions matter to you, if their imagined reactions are mostly positive, then you should feel proud of your choice.
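For the programmers in the thread, that technique is basically a majority vote over a fixed panel of trusted people. Here's a toy sketch in Python; the names, the imagined reactions, and the simple-majority reading of "mostly positive" are all my own placeholders, not anything from the comment above:

```python
# Toy model of the "trusted panel" technique: imagine each person's
# reaction to your choice, then check whether the verdict is mostly positive.
panel = {
    "partner": "positive",
    "best friend": "positive",
    "old mentor": "negative",
}

positive_votes = sum(1 for reaction in panel.values() if reaction == "positive")

# "Mostly positive" is interpreted here as a strict majority (an assumption).
if positive_votes > len(panel) / 2:
    print("Mostly positive: feel proud of your choice.")
else:
    print("Mixed reviews: maybe give it a second thought.")
```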
Actually, the reason I order the last item the server mentioned is crippling social anxiety.