Tag Archives: cognitive bias

Why anti-vaxxers get it so wrong

By Tim Harding

The inability to accurately appraise one’s own knowledge is a cognitive bias known as the Dunning-Kruger Effect, first identified in social psychology experiments published in 1999. It occurs when a lack of knowledge about a particular subject leads individuals to misjudge their expertise on that subject: ignorance of one’s own ignorance can lead people who lack knowledge on a subject to think of themselves as more expert than those who are comparatively better informed.

A recent study published in the peer-reviewed journal Social Science &amp; Medicine (and summarised in The Conversation) demonstrated that at least some anti-vaccination views are based on the Dunning-Kruger Effect. The study found that 71 per cent of those who strongly endorse misinformation about the link between vaccines and autism feel that they know as much or more than medical experts about the causes of autism, compared to only 28 per cent of those who most strongly reject that misinformation.

The researchers found that 30 per cent of people who think they know more than medical experts about the causes of autism strongly support giving parents the latitude not to vaccinate their children. By contrast, only 16 per cent of those who do not think they know more than medical professionals feel the same way.

The study also found that people who think they know more than medical experts are more likely to trust information about vaccines from non-expert sources, such as celebrities. These individuals are also more likely to support a strong role for non-experts in the process of making policies about vaccines and vaccination.

Whilst these recent research findings may not come as a surprise to seasoned skeptics, we now have empirical evidence to explain why at least some anti-vaccination views are so irrational.


Filed under Logical fallacies, Reblogs

David Dunning on the Dunning-Kruger Effect

The Dunning–Kruger effect is a cognitive bias in which low-ability individuals suffer from illusory superiority, mistakenly assessing their ability as much higher than it really is. Dunning and Kruger attributed this bias to a metacognitive inability of those of low ability to recognize their ineptitude and evaluate their ability accurately. Their research also suggests corollaries: high-ability individuals may underestimate their relative competence and may erroneously assume that tasks which are easy for them are also easy for others. In Dr. Dunning’s own words:

“In 1999, in the Journal of Personality and Social Psychology, my then graduate student Justin Kruger and I published a paper that documented how, in many areas of life, incompetent people do not recognize — scratch that, cannot recognize — just how incompetent they are, a phenomenon that has come to be known as the Dunning-Kruger effect. Logic itself almost demands this lack of self-insight: For poor performers to recognize their ineptitude would require them to possess the very expertise they lack…”

“What’s curious is that, in many cases, incompetence does not leave people disoriented, perplexed, or cautious. Instead, the incompetent are often blessed with an inappropriate confidence, buoyed by something that feels to them like knowledge.”

“Some of our most stubborn misbeliefs arise … from the very values and philosophies that define who we are as individuals. Each of us possesses certain foundational beliefs — narratives about the self, ideas about the social order—that essentially cannot be violated: To contradict them would call into question our very self-worth. As such, these views demand fealty from other opinions. And any information that we glean from the world is amended, distorted, diminished, or forgotten in order to make sure that these sacrosanct beliefs remain whole and unharmed.”


Filed under Quotations

Confirmation bias

Confirmation bias, also called myside bias, is the tendency to search for, interpret, favor, and recall information in a way that confirms one’s beliefs or hypotheses, while giving disproportionately less attention to information that contradicts them. It is a type of cognitive bias and a systematic error of inductive reasoning. People display this bias when they gather or remember information selectively, or when they interpret it in a biased way. The effect is stronger for emotionally charged issues and for deeply entrenched beliefs.

People also tend to interpret ambiguous evidence as supporting their existing position. Biased search, interpretation and memory have been invoked to explain attitude polarization (when a disagreement becomes more extreme even though the different parties are exposed to the same evidence), belief perseverance (when beliefs persist after the evidence for them is shown to be false), the irrational primacy effect (a greater reliance on information encountered early in a series) and illusory correlation (when people falsely perceive an association between two events or situations).
 


Filed under Logical fallacies