Some problems are so complex that you have to be highly intelligent and well informed just to be undecided about them. - Laurence J. Peter

Sunday, March 22, 2009

Motivated Skepticism and Motivated Credulity

Two biases that lead to "stickiness" of political beliefs are motivated skepticism and motivated credulity. When I encounter an argument opposing some strongly-held belief of mine, I am prone to examine the argument extremely skeptically, never giving it the benefit of the doubt. If I find the slightest flaw - a minor factual mistake, a questionable statistic, an improbable leap of logic - I am tempted to dismiss the whole argument as clearly the work of a dishonest or incompetent mind. This doesn't feel wrong. It feels like being logical, and upholding the highest standards of debate. It feels righteous. This is motivated skepticism, also called disconfirmation bias.

Conversely, motivated credulity - also called the prior attitude effect - means giving a free pass to weak arguments whose conclusions I agree with. Nobody's perfect, right? This writer is clearly a paladin of a just and noble cause. If he accidentally misquoted someone, or relied on outdated statistics, or left out some key citations - big deal. I won't quibble over the details. That would be petty.

I hope I make these errors less frequently than I used to, but I suspect it's impossible to be sure I'm free of them altogether. I recall some particularly egregious examples from my past, especially when arguing about the Iraq war in 2002/2003. I was 18-19 then, and new to politics. I might write about my errors in more detail sometime.

Rule of thumb: Be skeptical of things you learned before you could read. E.g. religion. -- Ben Casnocha
Ben is right: lacking the ability to read will severely hamper your ability to form sound beliefs. Lacking an understanding of cognitive biases - such as motivated skepticism - is also a risk factor for dubious beliefs. I remember dismissing the arguments of those I thought fools based on a few tangential errors, back before I'd heard the term "motivated skepticism". That was my mistake. So I'd go further, and propose this rule of thumb: be skeptical of beliefs you formed and defended before you understood motivated skepticism.

This implies that everyone in the world should drastically lower their confidence in their cherished beliefs. Let's all do that.

I came across an example of motivated skepticism recently. Here are Andi Hazelwood and Albert Bartlett talking about peak oil and the limits of natural resources. They're discussing Malthus, saying he was basically right.
AB: I read Malthus three times. Malthus understood the problems of limits, and he understood these things. And his timetable--he couldn't have anticipated the mechanization of agriculture, which has greatly increased agricultural production worldwide...And I think with these limits showing up in terms of peak oil, peak natural gas--I think with these limits showing up people will have to reassess their proud claim that "we've proven Malthus wrong." Malthus, I think, will turn out to be... "well, yeah, he understood the problem, he was right." And when some learned scholar tells me "we've proven Malthus wrong," that scholar is telling me about himself; he's not telling me about Malthus.
The discussion then moved, as it should, to the economist Julian Simon, who would have disagreed with that. Here is how I wish the conversation had gone:
A: Julian Simon would say that agricultural innovations weren't accidental or coincidental, because whenever food became scarce, food prices increased, which increased the reward to those who found ways to increase food production. I guess we should address that.

B: Well, Norman Borlaug is considered the father of the Green Revolution, developing high-yield, disease-resistant wheat varieties. It's said that he saved millions of lives. So what motivated him?

A: According to this interview, he says he was motivated by seeing so much human misery. He was a teenager during the Depression.

B: So Simon was wrong! Borlaug wasn't motivated by money!

A: Well, maybe. Simon said that food shortages would lead to innovation through the mechanism of increased prices, and therefore growth could continue. In this particular case the mechanism seems to have been altruism rather than personal gain, but the process still worked. A resource shortage led to innovation and ensured continued growth. Ideally we would examine more examples of the motivations of people who improved agricultural yields.

B: Ok...so what does this mean for peak oil and peak natural gas, which we were talking about earlier?

A: Simon would probably say that we can expect innovation to solve resource shortages for those problems too.
My fantasy pundits sound so reasonable. The actual conversation was an exercise in motivated skepticism. Instead of addressing the criticism Julian Simon would have leveled at them, they found an irrelevant quote to attack:
AH: And I think we should point out that Julian Simon infamously said that we had enough technology in our minds and in our libraries to allow population to continue growing for the next 7 [million] years.

AB: That's an interesting story...my correspondent asked me, "What would the world population be if it just grew at the present rate of 1% per year"--which sounds terribly small as a growth rate--"if it grew at 1% per year for 7 million years?" Well...[it's] a number that you get by writing 1 followed by 30,000 zeroes...1 followed by 85 [zeroes] is the number of atoms estimated in the known universe, and Julian's population size would be 1 followed by 30,000 zeroes.

AH: So, no amount of technology is ever going to make it possible for those people to fit on earth.
...
AB: There aren't enough atoms to make that many people. But Julian was worshiped by the people in Washington who wanted to hear his message. And these people were often politicians who had no scientific judgment. But Simon had a Ph.D. And he was reasonably bright in the sense that he was bright enough to know what it was that important clients wanted to hear. And so he composed things they wanted to hear. And so they were very much enamored of him.
This could be my motivated credulity talking, but I think that when you take an assertion, add an assumption of your own (a constant 1% growth rate), and then derive an absurdity, you've only shown that the combination is impossible - which most plausibly indicts your added assumption, not the original assertion. To be fair, Julian's assertion of seven million years of growth on Earth seems a silly prediction in several ways. But regardless of that line's worth, it was irrelevant to their argument.
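For what it's worth, Bartlett's back-of-the-envelope number is easy to check. Here's a minimal sketch in Python: the 1% rate and the 7-million-year horizon come from the transcript, while the starting population of 7 billion is just my illustrative assumption (and barely matters at this scale).

```python
import math

# Figures from the transcript: 1% annual growth sustained for 7 million years.
# The starting population is an illustrative assumption; at this scale it barely matters.
growth_rate = 0.01
years = 7_000_000
start_population = 7e9

# Work in log10 to avoid overflow: log10(P) = log10(P0) + years * log10(1 + r)
log10_population = math.log10(start_population) + years * math.log10(1 + growth_rate)

print(f"Final population is roughly 10^{log10_population:,.0f}")  # about 10^30,000
print(f"Orders of magnitude beyond the transcript's 10^85 atoms: {log10_population - 85:,.0f}")
```

Working in logarithms is just a convenience to avoid overflow; the point is that a constant 1% compounded over seven million years dwarfs any physical bound, which is exactly why the added assumption, not Simon's original claim, is doing all the work in the reductio.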

So watch out for motivated skepticism and motivated credulity. And resist, if you can, the temptation to do what I just did - using my knowledge of cognitive biases as a rhetorical weapon to bludgeon my enemies. First, know yourself.