A recent article in the New Yorker entitled “Why Smart People Are Stupid” has been getting a lot of play in the media lately. The article is based on a study showing that being “cognitively sophisticated” makes you more blind to your own biases, and does not leave you any less biased than those with lower sophistication. This failure of intelligence to inoculate against bias is what produces the headline above.
As I have already written about in primers here and here, much of this cognitive information processing research is based on a dual-process model of human cognition. This model, based largely on the work of Nobel prize-winning psychologist Daniel Kahneman, holds that humans engage in two types, or “systems,” of cognition. System 1 thinking is described as quick, shoot-from-the-hip style thinking, relying on cognitive shortcuts called heuristics. As the brain is a miser of cognitive resources, System 1 thinking is the go-to style, a back-of-the-envelope calculator based on basic rules, emotional reactions, and gut instincts. Conversely, System 2 thinking is described as systematic, relying on effortful processing strategies that include a thoughtful evaluation of the evidence at hand. Of course we would all hope to engage in this kind of thinking all the time, but because of the resources required for System 2 thinking, we filter much of what we think about through System 1; there are literally not enough cognitive resources to evaluate so critically all the information we are bombarded with.
As psychologists poke and prod the human mind, new paths are being cut in the realm of cognition. We want to know what makes individual people think the way they do. Cognitive psychology then must devise clever ways of teasing out our thought processes in order to determine how those processes are shaped by different variables. One of the most interesting areas in this research is bias itself. Not only do we want to know how thinking works, we want to know how thinking fails to work properly. Bias is inescapably linked to human cognition, and understanding it is therefore vital to anyone interested in critical thinking.
What the meat of the study looked at, and what generated the designation of “stupid” from the New Yorker piece, was the “bias blind spot” (BBS). This bias is the result of being able to spot cognitive mistakes in others without subjecting ourselves to the same scrutiny. The BBS leaves us systematically rating others as more susceptible to bias than we are, which the study supports by looking at seven different cognitive biases. Previous research offers two explanations for this “meta-bias.” First, we have a false belief that we perceive and respond to the world objectively, a “naïve realism” (West, Meserve, & Stanovich, 2012). Therefore if someone has a belief that differs from ours, it must be a result of their being biased, or so we think. Second, we have an overreliance on introspection, and when we don’t detect any evidence of bias in ourselves, we again think that others must be making the mistake, not us. The problem, of course, is that many biases work below conscious awareness, and are not found even with the deepest soul-searching.
Up until this study, most of the research in the field of biases and heuristics has shown that as “cognitive sophistication” goes up, bias goes down. In the present study (which comprised two studies for generalization), participants were asked questions relating to seven classic biases, like the framing, anchoring, and outcome biases, and had their level of “cognitive sophistication” measured by reporting their SAT scores, taking a “Cognitive Reflection Test” and an “Actively Open-Minded” test, along with two other measures. Interestingly, the Cognitive Reflection Test is the same battery of questions that, in a highly publicized study from just a short time ago, was found to undermine religious thinking by prompting analytical thought.
The study found a number of key results: all participants displayed significant BBSs, the BBS scores were positively correlated with the measures of intelligence, and people who were aware of their BBSs were no more likely to overcome them. Neither of the two studies in the paper provided evidence that cognitive sophistication decreases the bias blind spot.
You might then raise the same question that I did: what if “smarter” people actually did have less bias, so that their rating of others as more biased than themselves would be accurate? The studies also controlled for this and found that the null hypothesis (that there is no relationship between the variables) was favored in every case.
This is all interesting, and fascinating from the perspective of dual-process psychology, but I believe that the media has gone far beyond what the study actually says (surprise, surprise). When looking at the actual data in the study, I find no reason to call smart people “stupid.”
Looking over the analyses in the study for myself, I am not skeptical of the results; I am skeptical of the conclusions drawn from them. The New Yorker says:
And here’s the upsetting punch line: intelligence seems to make things worse. The scientists gave the students four measures of “cognitive sophistication.” As they report in the paper, all four of the measures showed positive correlations, “indicating that more cognitively sophisticated participants showed larger bias blind spots.” This trend held for many of the specific biases, indicating that smarter people (at least as measured by S.A.T. scores) and those more likely to engage in deliberation were slightly more vulnerable to common mental mistakes.
However, if you look at the data, these positive correlations are meager. They range from 0.096 to 0.260, indicating a very modest relationship. As far as I can tell (the study was rather vague in the methods section) these relationships were reported using a statistical indicator called “Pearson’s r.” Very generally, this is a measure of linear correlation that runs from a perfect negative relationship (−1.0) through no relationship (0.0) to a perfect positive relationship (1.0) between two variables. Also very generally, the size of an effect of this kind is considered strong at 0.50, moderate at 0.30, and weak at 0.10 and below. Considering this rule of thumb, none of the positive correlations showing more intelligent people have larger bias blind spots are even of moderate strength (noting that the relationship between the BBS and the Cognitive Reflection Test of analytical thinking was below weak).
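To get an intuitive feel for how modest a correlation of 0.260 is, here is a minimal sketch using synthetic data (not the study’s data) that builds two variables with a built-in correlation of 0.26 and then measures the sample Pearson’s r:

```python
import numpy as np

# Illustrative only: construct two variables whose population correlation
# equals 0.26, the strongest value reported in the paper.
rng = np.random.default_rng(0)
n = 100_000
r_target = 0.26

x = rng.standard_normal(n)
noise = rng.standard_normal(n)
# Mixing x with independent noise this way yields Corr(x, y) = r_target.
y = r_target * x + np.sqrt(1 - r_target**2) * noise

r_sample = np.corrcoef(x, y)[0, 1]
print(round(r_sample, 3))
```

If you scatter-plot `x` against `y`, the cloud of points looks nearly shapeless; that is the kind of relationship behind the “smart people are stupid” headline.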
There is another statistical factor compounding these weaknesses. When we multiply Pearson’s r by itself, we get a number (r²) that indicates how much of the variance (the amount that a variable “spreads out” along a scale) is shared by the two variables being correlated. Let’s use the strongest correlation, 0.260, and square it to get 0.0676, or 6.76% of variance explained. If we then imagine two circles, one for the measure of intelligence and one for the measures of the BBS, variance explained could be thought of as how much those circles overlap. It’s not hard to see how circles that overlap over only roughly 7% of their respective surfaces represent a weak connection indeed.
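The arithmetic here is easy to verify. A short sketch computing r² for the weakest and strongest correlations reported in the paper:

```python
# Square the reported Pearson's r values to get the proportion of variance
# shared between the intelligence measures and the bias blind spot scores.
for r in (0.096, 0.260):
    r_squared = r ** 2
    print(f"r = {r:.3f} -> r^2 = {r_squared:.4f} "
          f"({r_squared * 100:.2f}% of variance explained)")
```

Even at the top of the reported range, less than 7% of the variance is shared; at the bottom, under 1%.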
If we can say anything about these results, it is that they are weak at best. So weak in fact that they could be better explained by other factors. Accordingly, the authors of the paper noted some of these possible factors (after remarking on how conservative these positive correlations were), something that was left out of the New Yorker piece.
For instance, the authors, far from calling smart people “stupid,” simply state, “cognitive ability provides no inoculation at all from the bias blind spot…” (West, Meserve, & Stanovich, 2012). The authors also note that this is in line with the idea that many of our biases operate below strategic control, meaning that they occur fundamentally unaffected by intelligence. Lastly, the authors note that those with a higher intelligence may have experienced a “justified rating” effect, where, because of their intelligence, they expected to do better than others of lower intelligence. However, because the bias questions asked had no association with cognitive capabilities, a “hostile environment” was created for the more intelligent people, hampering their ability to introspect and increasing their bias blind spot.
The authors of the paper did not offer any conclusions that could be construed as “smart people are stupid”; they offered alternative, plausible explanations for what they found in the face of rather weak data supporting an interesting result. What they found was very interesting in itself, that everyone has a bias blind spot and intelligence does not protect you from it, but somehow the media twisted it into something for taking those smarties down a peg.
From this study, can we really say that smart people are “stupid”? I would have to say no, we can’t. The relationships supposedly demonstrating this effect were weak, and the authors themselves couched their results in a context of alternative explanations. For the sake of context, also remember that this was but one study that contradicted a whole lot of research into heuristics and biases. Again, I am not discounting their findings, but I would pump the brakes a bit if I found something that went against years of scholarship.
As the authors conclude, intelligence offering no reprieve from bias is an outcome of our dual-process thinking style. System 1, the quick and dirty decision maker, often betrays us before our more systematic process can rectify any mistakes. However, heuristics and biases can also be incredibly useful, especially for intelligent people. Imagine the well-honed heuristics that allow a scientist to quickly scan a large pool of data and sniff out a possible relationship. Would this be biased? Yes. Can it be useful to think like this? Of course. I could readily conceive of a situation where an intelligent person being more biased would be beneficial (perhaps a homicide investigator is more biased towards valuing physical evidence than imagined motives), but I am speculating here.
Move intelligence out of the psychology lab and into the real world, and a highly biased system of well-honed heuristics can be immensely useful, especially in science. Bias isn’t always a good thing, and I am not suggesting otherwise, but it does not make you stupid. Do we actually define “stupid” as falling prey to bias? Because there isn’t anyone who doesn’t.
West, R. F., Meserve, R. J., & Stanovich, K. E. (2012, June 4). Cognitive Sophistication Does Not Attenuate the Bias Blind Spot. Journal of Personality and Social Psychology.