by Zachary Wright
Introduction
Imagine, for a moment, that a critic of the church stumbles across some quotes about the Adam-God theory. “Brigham Young taught this, and seeing as he was a prophet, this must be what Mormons believe!” they reason to themselves. They post these quotes on social media, proclaiming that “Mormons believe that Adam was God the Father! Your leaders taught it, it’s what you profess to believe!” What this critic does not know, however, is that as early as 1897 we have records of church leaders stating that Brigham “no doubt expressed his personal opinion or views upon the subject,” which supports the idea that members of the church are under no obligation to accept the notion that Adam was God the Father (1). The critic doesn’t retreat from their position, though, doubling down on the claim that members of the church believe it. At the end of the day, their bias led them to an incorrect conclusion about what members of the church believe.
For those who are well acquainted with criticisms of the church, I’m sure you can easily recall similar instances wherein bias led people to incorrect conclusions about our beliefs. However, unlike most of the cognitive biases we’ve discussed previously, the bias outlined above can’t necessarily be overcome simply by taking time to think about it. It’s a little different, and researchers have come to call it the Dunning-Kruger Effect, which is what we’ll be discussing today. Critical thinkers need to be aware of cognitive biases because of their possible impact on our daily decision-making and behavior, and thus on our ability to arrive at conclusions and solve problems. So today, we’re going to go over the Dunning-Kruger effect and its quasi-reverse bias (often referred to as Imposter Syndrome), and afterwards we’ll talk about what we can do to combat these biases. Let’s get into it.
How Confidence Relates to Bias
Now, I need to lay some groundwork for the rest of the article to make sense. It’s understood that confidence relates to bias in a few different ways. For example, the Overconfidence Bias is understood to be “the tendency people have to be more confident in their own abilities” than is objectively warranted. (2) This has been demonstrated in a plethora of studies, such as one showing that younger children rate their memory recall ability more highly than their older peers do. (3) This makes sense…we talked about how dominant emotions (such as high confidence) are associated with cognitive biases in our previous episode. As we’ll soon see, however, the connection between confidence and bias goes a little deeper than that.
So, a group of researchers set up a study where they had participants determine whether groups of dots moved left or right. Sometimes the dots moved more noticeably in a given direction, and other times less noticeably. After the participants made decisions about which direction the dots were moving, they were shown evidence that either confirmed their suspicions (when they guessed right) or went contrary to their suspicions (when they guessed wrong). The researchers used magnetoencephalography, which is just a fancy method of scanning brain activity, to see how areas of the brain related to confidence behaved throughout this process. At the end of the study, the researchers stated the following:
As hypothesized, we found that after high confidence (vs. low confidence) decisions, accumulation of neural evidence was facilitated if it was confirmatory, but largely abolished if it was disconfirmatory…In other words, our MEG analysis reveals that high confidence leads to post-decision accumulation becoming “blind” to disconfirmatory evidence. (4)
To translate that, it basically means that when the participants were highly confident in their decisions, they became blind to evidence that went against those decisions. If that sounds familiar, it should, because in our last episode this phenomenon was described as confirmation bias. This study was super useful in demonstrating how our confidence can relate to bias in ways we may not expect, and it’s important to keep this in mind as we proceed to talk about more confidence-based biases.
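If it helps to see the idea in concrete terms, here is a minimal toy sketch of “confidence-gated” evidence weighting. To be clear, this is my own illustration under made-up assumptions (the function name, the 0.1 down-weighting factor, and the Gaussian noise are all arbitrary choices of mine); it is not the model or analysis the researchers actually used.

```python
import random

def accumulate_post_decision_evidence(decision, confidence_high, n_samples=100,
                                      true_direction=+1, noise=1.0,
                                      disconfirm_weight=0.1):
    """Toy sketch: sum up post-decision evidence, down-weighting samples that
    disagree with the choice whenever that choice was made with high confidence
    (a crude stand-in for the 'blindness' described in the study)."""
    total = 0.0
    for _ in range(n_samples):
        sample = true_direction + random.gauss(0, noise)  # noisy evidence about the true motion
        confirms_decision = (sample * decision) > 0       # does this sample agree with the choice?
        weight = 1.0
        if confidence_high and not confirms_decision:
            weight = disconfirm_weight                    # disconfirming evidence barely registers
        total += weight * sample
    return total

random.seed(0)
# The observer chose -1 (leftward) even though the dots actually drift +1 (rightward),
# so most of the post-decision evidence disconfirms the choice.
print("accumulated evidence after a low-confidence choice :",
      round(accumulate_post_decision_evidence(decision=-1, confidence_high=False), 1))
print("accumulated evidence after a high-confidence choice:",
      round(accumulate_post_decision_evidence(decision=-1, confidence_high=True), 1))
```

In this sketch, the high-confidence run accumulates almost none of the corrective evidence pointing toward the right answer, which is the gist of what the authors report, even though the real study’s model and analysis are far more sophisticated.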
However, I do want to caution against any inference that confidence is a bad thing, or that confidence indicates weakness in a position. As we talked about in our article on epistemology, there are ways to be epistemically confident in something that leave us less susceptible to errors. We don’t have to associate confidence with bad ideas, and in my opinion, we shouldn’t! As we’ll discuss later, a lack of confidence can lead to just as many problems in the long run, if not more. Even so, critical thinkers should be aware of how certain confidence-related biases can affect our ability to make decisions and solve problems. As we’ll soon see, the effects of these kinds of biases can be long-lasting and far-reaching.
The Confidence Biases at Large
To begin explaining more concretely how bias and confidence may be related, an excellent place to start is the Dunning-Kruger Effect. It earned its name from a pair of psychologists, David Dunning and Justin Kruger, who published a very famous paper in 1999 in which they tested people on their social and intellectual skills and then asked the participants to rate how well they had actually done. What did they find? Those who scored in the bottom quartile rated themselves as being much more competent than they actually were (5). Since that time, psychologists have referred to this effect as the Dunning-Kruger effect, and it’s been a well-studied phenomenon ever since.
Those who are familiar with my previous article will recall the aspects that make up a cognitive bias. While each cognitive bias is different, most stem from people drawing patterns, prioritizing information that correlates with what they already know, focusing on dominant or impactful data, and ignoring seemingly irrelevant information (6). Unfortunately for those who fall victim to the Dunning-Kruger effect, one could make a convincing argument that they fall prey to all four of these heuristics in negative ways. First, they recognize the pattern that they’ve been able to accomplish tasks such as “deductive reasoning” before; however, there’s no guarantee that because they’ve successfully done something before, they’ll successfully do it again. Next, they draw on and prioritize the information they already know, at the expense of other information. This leads to further problems, since the information they’re missing is often important and may complicate the simple answers they’re looking for. This becomes increasingly apparent as time goes on, but by the time they realize it, the effects have often already had long-lasting consequences.
To explain why this is, I first need to explain a common way that the Dunning-Kruger effect manifests itself. Let’s use myself as a hypothetical example. Arguably, I happen to be one of the few people right now doing a series on critical thinking in regard to LDS theology and history. Let’s say that I suddenly gain a lot of popularity for my discussion of critical thinking. I rack up thousands upon thousands of views on my videos and a lot of internet traffic on my articles. People start reaching out to me, asking me questions about obscure aspects of LDS history, which I may know less about. I can tell you about the things I’ve studied (epistemology, evaluating sources, logic, etc.), but I’m less equipped to tell you about every single pioneer story ever recorded. However, because I want to be successful, and because I want to make people happy, I start exploring other topics that I have less experience in. Consequently, I begin saying more things that are wrong; but because I was right about some things, I have the confidence to keep persisting in my incorrect ideas. Any attempts to correct those ideas would fall on deaf ears, because I would be far too inexperienced to recognize where I was wrong. This is the Dunning-Kruger effect in its purest form, and Dunning and Kruger described the principle in the following way:
Those with limited knowledge in a domain suffer a dual burden: Not only do they reach mistaken conclusions and make regrettable errors, but their incompetence robs them of the ability to realize it. (7)
In the formal business world, a related idea is referred to as The Peter Principle, which observes that managers “prioritize current job performance in promotion decisions at the expense of other observable characteristics that better predict managerial performance.” (8) In other words, managers look at what employees do now and assume that current performance predicts how well they will perform in a position that, in reality, would be too much for them. Events like this often feed the Dunning-Kruger effect: the person being promoted (either formally, as in a business, or informally, as on social media platforms) suddenly begins to feel confident about their abilities, despite an often obvious lack of ability in the new area. The bias here is evident. We recognize patterns of our past success. We focus on the dominant and impactful ideas that agree with our own. We ignore or downplay the issues that challenge our ideas. After all, if we were right before, who’s to say that we won’t be right again? However, if we put that into syllogistic form (sketched more formally just after the list below), it falls apart rather quickly.
- P1: I was able to solve problem X using method Z
- P2: I am presented with problem Y
- Conclusion: I will be able to solve problem Y using method Z
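To spell out that structure a bit more formally, here is one way to symbolize it (the notation below is just my own shorthand for this article, not anything taken from Dunning and Kruger or a logic textbook):

```latex
% Let S(p, m) mean "I successfully solve problem p using method m."
% The argument above then has the shape:
%
%   P1: S(X, Z)           I solved problem X with method Z
%   P2: Presented(Y)      I am now presented with problem Y
%   ------------------------------------------------------
%   C:  S(Y, Z)           Therefore, I will solve problem Y with method Z
%
% Nothing in the premises connects Y to X or to Z, so the conclusion does not
% follow; a single case where method Z works on X but fails on Y defeats it.
\[
  \frac{S(X,Z) \qquad \mathrm{Presented}(Y)}{\therefore\; S(Y,Z)}
\]
```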
The premises, in this instance, do not support the conclusion, and I hope you can do some research to find out what fallacy this is ;). Proficiency in one area of study does not guarantee proficiency in another. I’m sure the reader can imagine a plethora of examples of the Dunning-Kruger effect being present at work, in politics, and, of course, in religious discussion.
Now, I certainly hope that I’ve stated things that are accurate and useful to critical thinkers, but regardless of how well-intentioned I may be, this does not change the fact that the internet as a whole is an unprecedented breeding ground for the Dunning-Kruger effect. Normal, ordinary people with little to no formal training in a given field are suddenly given power and influence over hundreds, if not thousands, of people. They can proclaim utter nonsense about a given topic, and not a single person will realize that they have no idea what they’re talking about. This is why critical thinking skills are as important as they are. We need to be able to parse through information ourselves and recognize the strengths and weaknesses of the arguments at play. If we don’t, it becomes all the easier for us to fall victim to cognitive biases and otherwise fallacious thinking.
Now, I will introduce one caveat here, just to cover my bases. There is some controversy at the moment regarding whether the Dunning-Kruger effect is actually a psychological bias. Some writers and researchers advocate the idea that the effect is more statistical in nature. (9) The question of whether this is a psychological issue or a statistical one is important to consider. However, the authors of these articles don’t deny that these behavioral patterns exist; they simply attribute them to other factors, such as the Overconfidence bias. Whatever the underlying cause (psychological, statistical, or otherwise), I think these patterns of behavior and thought are worth mentioning and analyzing.
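For readers curious what “more statistical in nature” can look like in practice, here is a minimal toy simulation of one version of that argument (my own illustration under simple, made-up assumptions; it is not the analysis from the cited paper). Every simulated person’s self-estimate is an unbiased but noisy guess about their own percentile, yet grouping people by actual performance still produces the familiar pattern, purely because of the noise and the bounded 0–100 scale.

```python
import random

random.seed(42)
N = 100_000
NOISE_SD = 25  # arbitrary amount of noise in self-estimates

# Each person has an actual percentile (0-100) and a noisy, unbiased
# self-estimate of that percentile, clamped to the 0-100 scale.
people = []
for _ in range(N):
    actual = random.uniform(0, 100)
    estimate = min(100.0, max(0.0, actual + random.gauss(0, NOISE_SD)))
    people.append((actual, estimate))

# Group people by actual-performance quartile and compare group means.
quartiles = [[], [], [], []]
for actual, estimate in people:
    quartiles[min(3, int(actual // 25))].append((actual, estimate))

for i, group in enumerate(quartiles, start=1):
    mean_actual = sum(a for a, _ in group) / len(group)
    mean_estimate = sum(e for _, e in group) / len(group)
    print(f"Quartile {i}: mean actual = {mean_actual:5.1f}, "
          f"mean self-estimate = {mean_estimate:5.1f}")

# Typical output: the bottom quartile's average estimate sits above its average
# actual score, and the top quartile's sits below, even though no one in this
# toy world is systematically overconfident or underconfident.
```

Whether effects like this fully explain the original findings is exactly what the debate is about; the point of the sketch is only to show how a Dunning-Kruger-looking pattern can emerge from noise alone, without any psychological bias.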
Flipping the Script
We’ve talked about how people who are low-achieving tend to rate themselves as much more proficient than they actually are. However, that still leaves the question of how people who are proficient in their field of study view themselves. Now, you’d initially think that people who are proficient in their fields would likewise rate themselves as being significantly better than their peers in their profession. However, we actually find the opposite to be true. High-achieving, competent people often rate themselves as being less competent than their peers. (10) It’s my understanding that this doesn’t happen all the time, but one study estimated that it happens to about 70% of the population at least once in their lives. (11) Some authors note that this is often cited in association with Dunning and Kruger, though those two psychologists never expressly discuss it in their studies. (12) Even so, I do think that exploring this issue may prove to be useful.
The imposter effect, also known as Imposter Syndrome, was initially described by Pauline Rose Clance and Suzanne Imes, who observed that those who are extremely competent tend to downplay their abilities, feeling less competent than they actually are. (13) While this pair of researchers initially assumed that the phenomenon was exclusive to women, most researchers now accept that it is not. (14) Some authors argue that it affects women more than men, while others suggest that Imposter Syndrome simply manifests differently in men and women. (15) Regardless, the confidence problem inherent in the imposter effect feeds the Dunning-Kruger effect: while competent people hold back, less competent people end up being given a larger platform through the Peter Principle. As demonstrated earlier, this can lead to a lot of problems within a short period of time.
The imposter effect can have some very negative effects on people and their performance. As you can imagine, it’s associated with higher rates of anxiety and depression. (16) Those feelings are serious and can cause problems in a variety of areas of our lives, but something just as problematic is how the imposter effect can hinder critical thinkers in their quest to solve problems. How so? Well, let’s say that you have a brand new foreign-speaking missionary who’s afraid to speak up because their trainer appears much more competent at speaking the language, or even at understanding the lesson content. How about a bishop who resists a calling to become a stake president because he’s afraid that others will “discover his inadequacy”? Examples of similar phenomena can be found outside of LDS culture, but you can probably already see the principle at play. This can inhibit critical thinkers from making decisions that will genuinely solve problems.
That prompts the question, though: how do we overcome the Dunning-Kruger effect and Imposter Syndrome? Well, I think the approaches mentioned in the last article are worth noting. Slowing down, looking at things from a different perspective, and getting a multitude of opinions may all help decrease the strength and longevity of these effects. However, I think it serves us to examine this from a more theological standpoint, because both of these problems are resolved in a similar way. True humility is the answer here. To explain why, we first need to explore what humility is.
Humility, in our theological context, is described as “recognizing our dependence on God.” (17) I totally agree with that, and of course, an obvious application here is that humility allows people to more easily admit when they don’t know something, blunting the Dunning-Kruger effect. Why? Because suddenly our eternal value doesn’t come from what we know; it comes from who we are. Humility, when channeled in an appropriate way, can allow us to more easily admit when we are wrong. If we really think about it, this kind of overconfidence comes from basing our inherent value on the things we can do and know, instead of basing our value on who we are in relationship to God.
What about Imposter Syndrome, though? Surely that must be different, right? Well, when we recognize that one of the most important aspects of our identity is that we are children of God (18), it becomes much easier to recognize that we are not better than anyone else. However, the opposite is also true: no one is inherently better than us. Instead of feeling like we’re letting people down, those who are truly humble recognize that even if there are imperfections in what they do, those imperfections do not make them less than other people. Humility, in an ironic twist, decreases the fear that comes from failure. When we realize that our true worth comes from our inherent, inalienable relationship with God, we open the door to true, enduring happiness.
Now, I’m not blind to the fact that humility exists outside the church. Christians of other denominations, atheists, and members of the church are all able to experience this kind of humility (though arguably for different reasons). However, I bring this idea up for a few reasons. First, obviously, I want to show how confidence and bias relate to each other, and likewise show one way that I think we can mitigate the effects of unhealthy confidence. Second, I want to show one of many ways that looking at things from different perspectives can be useful. This is the epitome of critical thinking: the idea that we can bring together data of different types, analyze it, and then use it to help us solve problems. This pattern is something that I hope everyone will try to emulate. You don’t even need to be a member of the church to appreciate how this method of data analysis can be useful.
In conclusion, I think that there are some important things to consider when we talk about how confidence can impact bias. First, we need to know some of the mechanics behind how bias and confidence are related. We also need to know some of the documented biases that relate to confidence, and how to counteract them. Again, I want to reiterate that certainty isn’t a bad thing, but like most things, it should be employed with careful consideration and restraint. As we do so, it’ll be easier for us to arrive at correct conclusions, and eventually to become the kinds of thinkers and believers God wants us to be.
References:
1. https://catalog.churchofjesuschrist.org/assets/80389221-debe-4168-89a6-9e2bd7709a5c/0/369; The quote comes from Joseph F. Smith, a contemporary of Brigham Young and a future prophet and leader of the church. While this quote was given after Brigham died, other leaders, such as Orson Pratt, publicly denounced the doctrine while Brigham was still alive. Later, after meeting with Brigham, Pratt apologized. Pratt continued to allow Brigham his opinion uninhibited, but stated that he would not teach the doctrine himself (link here). Orson was counseled by the brethren on the matter, but was never once disciplined by the church for this position. There are also points wherein Brigham himself was less dogmatic about the Adam-God theory, and you can read more about that here.
2. https://ethicsunwrapped.utexas.edu/glossary/overconfidence-bias
3. Piehlmaier, D. M. (2020). Overconfidence Among Young Decision-Makers: Assessing the Effectiveness of a Video Intervention and the Role of Gender, Age, Feedback, and Repetition. Scientific Reports, 10(1), 3984. https://doi.org/10.1038/s41598-020-61078-z
4. Rollwage, M., Loosen, A., Hauser, T. U., et al. (2020). Confidence drives a neural confirmation bias. Nature Communications, 11, 2634. https://doi.org/10.1038/s41467-020-16278-6
5. Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one’s own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121–1134. https://doi.org/10.1037//0022-3514.77.6.1121
6. Korteling, J. E., Brouwer, A. M., & Toet, A. (2018). A Neural Network Framework for Cognitive Bias. Frontiers in Psychology, 9, 1561. https://doi.org/10.3389/fpsyg.2018.01561
7. https://www.psychologytoday.com/us/basics/dunning-kruger-effect
8. https://economics.harvard.edu/files/economics/files/ms28914.pdf
9. https://www.mcgill.ca/oss/article/critical-thinking/dunning-kruger-effect-probably-not-real?fbclid=IwAR13aDSHXbp-X5vsEoOOxLLsQbKNpxt77CnIqto5TllpIdoRehSOK4cRW1Q
10. https://dictionary.apa.org/impostor-phenomenon; see also https://psycnet.apa.org/doiLanding?doi=10.1037%2F0022-3514.77.6.1121
11. Gravois, J. (2007). You’re not fooling anyone. The Chronicle of Higher Education, 54(11), A1. Retrieved November 5, 2008 (link here, though there is a paywall to the full article).
12. Magnus, J. R., & Peresetsky, A. A. (2022). A Statistical Explanation of the Dunning-Kruger Effect. Frontiers in Psychology, 13, 840180. https://doi.org/10.3389/fpsyg.2022.840180; This is another of the articles that try to use statistics to explain the Dunning-Kruger effect.
13. https://www.psychologytoday.com/us/blog/neuroscience-in-everyday-life/202308/the-history-of-imposter-syndrome
14. Feigofsky, S. (2022). Imposter Syndrome. HeartRhythm Case Reports, 8(12), 861–862. https://doi.org/10.1016/j.hrcr.2022.11.001 (if that link doesn’t work, try this one).
15. https://online.utpb.edu/about-us/articles/psychology/exploring-imposter-syndrome-and-gender; see also https://www.forbes.com/sites/lucianapaulise/2023/03/08/75-of-women-executives-experience-imposter-syndrome-in-the-workplace/?sh=2da5b9576899; The first talks about how women experience it more than men; the latter says otherwise.
16. https://www.apa.org/gradpsych/2013/11/fraud; Some sources claim that this affects those who are considered demographic minorities differently than those who are not. For example, one study suggests that for African Americans, “impostor feelings are a stronger factor in mental health than perceived discrimination and possibly minority status stress” (link here). As you can see, imposter syndrome can have repercussions that affect a lot of discourse about minority populations.
17. https://www.churchofjesuschrist.org/study/scriptures/gs/humble-humility?lang=eng
18. https://www.churchofjesuschrist.org/study/broadcasts/worldwide-devotional-for-young-adults/2022/05/12nelson?lang=eng
Further Study:
- “Why ignorance fails to recognize itself,” featuring David Dunning; This is pretty interesting, because one of the original researchers for whom the “Dunning-Kruger” effect is named explains more about his findings. Definitely worth listening to.
- https://www.churchofjesuschrist.org/study/general-conference/1989/04/beware-of-pride?lang=eng; This talk by Ezra Taft Benson shows how pride can be manifested in a few different ways, and is combatted most effectively by humility.
- https://www.ncbi.nlm.nih.gov/books/NBK585058; This gives a solid outline of the findings behind imposter syndrome, and likewise explores ways in which it manifests.
Zachary Wright was born in American Fork, UT. He served his mission speaking Spanish in North Carolina and the Dominican Republic. He currently attends BYU, studying psychology, but loves writing and studying LDS theology and history. His biggest desire is to help bring people closer to each other, and ultimately closer to God.