Justifying the Jump

by Erica Dawson

I do not remember why I decided to jump out of a plane. But I had reserved a date, put down a sizeable - and nonrefundable - deposit, and, most significantly, bragged to everyone I knew that I was going to do it. There was no backing out, even though I was growing increasingly doubtful as the day neared.

Maybe this was not a good idea. Perhaps, rather than being daring, it was just being dumb.

To reassure myself, I took to the Internet. My Google search for "skydiving safe" returned a long list of sites. I clicked on the first, a blog hosted by a site selling skydiving gear, and scanned the article. The first lines held my answer: "You stand a better chance of dying in a car accident on the way to the dropzone than you do of dying from your jump. Skydiving is safer than driving a car." I did not need to read any further.

Whew! Not dumb. Daring. The data backed me up, and you can't argue with the data.

"You can't argue with the data" is a familiar persuasion device, one most often deployed when we are trying to defend beliefs that we consciously or unconsciously want to be true. Beliefs like, "I'm a smart, competent person. My politics are sound, my religious beliefs valid, my decisions informed. I'm healthy and wise. My views about the world are correct."

And skydiving is safer than driving a car.

It was only later that night at a gathering of friends (most of whom, like me, study decision biases) that my sound reasoning was called into question. "Safer comparing what?" asked one. "Miles driven to miles fallen? Number of car trips to number of jumps? Time in the sky versus time on the road?" "And how do you mean 'safer'?" asked another. "You can walk away from most car accidents. No one walks away from a parachute that didn't open."

In the end, the group decided that the proper comparison was the survival rate of people who board an operational airplane and land in it versus the survival rate of those who board and then jump out of it. And also, that I was dumb.

In trying to convince myself of something I really wanted to be true - that everything was going to be okay - I illustrated the very processes of motivated reasoning that I had spent my graduate career researching. Motivated reasoning describes the differential way we evaluate data about propositions we wish to believe and propositions we wish to reject. Our own orientation toward the implied hypothesis is the "motivation" part. What we do with the data is the "reasoning" part.

A large body of research, including my own, demonstrates that when we are thinking about a favorable proposition - that is, one that is flattering, or accords with our worldview, or supports some line of action we have already decided to take - we tend to ask the implicit question, "Can I believe this?" We look for data that support what we want to believe and stop as soon as we find some. We do not think all that carefully about it but tend to take it at face value. We interpret ambiguous information as supporting our hypothesis. By applying a relatively low standard for acceptance, looking for supportive evidence, and thinking superficially about it, a person so motivated can almost always find some basis for believing. "Can I believe this?" Yes!

On the other hand, we turn into excellent critical thinkers when faced with a proposition we do not like - one that impugns some aspect of our identity or challenges a cherished worldview, for example. In this case, we implicitly ask not "Can I?" but "Must I believe this?" We want to see all the data, not just a subset or convenience sample. We think very carefully about it, always looking for the fatal flaws. By thoroughly considering all the information and thinking critically about it, people motivated to reject a proposition often do spot the inconsistencies, ambiguities, and statistical flaws inherent in any real-world body of evidence.

"Must I believe this?" No! It's a neat trick. We look for evidence in both cases, so no one can accuse us (and we cannot suspect ourselves) of bias. Both "Can I?" and "Must I?" are perfectly reasonable decision criteria. They are, in fact, embedded in the American legal system, which requires us to prove criminal cases "beyond a reasonable doubt" ("Must I?") but to show only a "preponderance of evidence," usually defined as greater than 50 percent, in civil cases ("Can I?"). (This is how O.J. Simpson was innocent of murder in one trial but guilty in another, based on the same evidence.)

The bias isn't in simply believing whatever we like at will. Rather, it's the subtler process of unconsciously applying a lower standard of acceptance to things we want to believe and a higher standard to things we want to reject. The result is that we believe ourselves to be both unbiased and right. This extends to social and political views on topics as wide-ranging as climate change, fracking, economic policy, reproductive freedom, and whether there really is such a thing as the hot hand in basketball. Anywhere there is disagreement, people on both sides are citing data to support their views.

And because "you can't argue with the data," we all tend to believe that if people who disagree with us had access to the same information we have, surely they would come to the same conclusion that we have. Except, of course, they don't - and they are thinking the same thing about us.

That's certainly what was going on at our dinner party. I was trying to inform my colleagues of the data so that they could come to the same reasonable conclusion as I had. They thought I was crazy. They helpfully pointed out that I had loaded the dice from the beginning by searching for "skydiving safe" (as opposed to, for example, "skydiving fatal"); by failing to question the source of the information I cited, its validity, and the possible bias of its author; and by closing my computer as soon as I got the answer I wanted. I had to concede the point.

My story illustrates the motivated reasoning of an individual making a "go/no-go" decision, the outcome of which affects no one more than the decider. But more complex and consequential judgments are equally susceptible to the forces of desire. Business and government leaders make decisions every day that potentially affect the lives and livelihoods of millions, and they frequently do so in the context of a preferred outcome.

Because it is an error to judge the quality of a decision process solely by the outcome of the decision, it can be difficult for an outsider to conclusively point to motivation as a factor in any particular situation. For example, imagine a person who seeks a diagnostic medical test for a serious but rare condition. When the test comes back negative (that is, indicating an absence of disease), the patient accepts the results at face value and concludes she does not have the condition. Is this motivated reasoning? She decided to believe what she wanted to believe - that she is healthy - even though diagnostic tests are rarely, if ever, 100 percent accurate. In this case, though, even though the patient reached a favorable conclusion, there is no evidence that she ignored or manipulated data to get there. Rationality simply happened to align with desire.

In contrast, consider the Bush administration's decision to invade Iraq in 2003, justified to the public on the grounds that Saddam Hussein possessed weapons of mass destruction. When it was discovered that he did not, vocal critics of the war claimed that President George W. Bush had manufactured a reason to invade simply because he wanted to. (Theories about why he was thus motivated include avenging his father, benefitting politically, and giving free rein to hawkish tendencies.) The charge was that the administration concocted bold lies to sell to Americans.

Subsequent analysis reveals a perhaps less intentional, but no less pernicious, process of motivated reasoning. By their own report, key decision makers started with the question, "Can we make the case that WMDs exist?" and organized their decision strategies with the goal of answering yes. Among other things, those involved tended to accept at face value intelligence consistent with their theory and passed that on to the president's inner circle. In contrast, they scrutinized intelligence that contradicted their beliefs, asking the CIA to justify its methods, questioning the reliability of the sources, and at times questioning the competence and political fealty of those delivering news they did not like. Data contradicting the presence of WMDs was less likely to be passed up the line.

This group was also prone to interpreting ambiguous information in a way that would support their beliefs. When independent inspectors repeatedly failed to find evidence of an Iraqi program to produce biological, chemical, and nuclear weapons, the administration concluded not that such programs probably didn't exist, but rather that Hussein had become exceptionally clever at hiding them. They elevated unsubstantiated intelligence that Iraq was seeking "yellowcake" uranium from Niger to the level of undisputed fact, with Colin Powell reporting in 2002 to the House International Relations Committee that "With respect to the nuclear program, there is no doubt that the Iraqis are pursuing it." When aerial photographs of Taji revealed a configuration of buildings consistent with an active chemical munitions site, the administration concluded that an active munitions site was the only thing it could possibly be, again presenting a suggestion as fact. (No weapons were subsequently found.)

In short, Bush's political opponents accused him of manufacturing an excuse to invade - of willfully ignoring the facts. But a closer look turns up many clues that those in the Bush administration were looking very closely at the facts. They did so, however, with the goal of confirming their hunch that WMDs, their justification for war, existed. In this way, some in the group became convinced that not only were they right, but that they had also arrived at their conclusion after a thorough and unbiased accounting of evidence.

One misstep in this example, as in my own skydiving story, was the failure to engage a team of rivals early in the decision process. A potentially powerful way to counter the bias of motivated reasoning is to really listen to people who genuinely disagree with you, or at least who have no stake in your pet theory being true. They, just like my earth-bound colleagues, are most likely to expose real flaws in your reasoning that your "Can I?" approach may have led you to miss.

What you do with their insights is up to you. This summer, I'll make my 100th skydiving jump. Assuming I don't crash on the way to the dropzone.

Erica Dawson is the Interim Director of the U.S.-Israel Center on Innovation and Economic Sustainability at the Rady School of Management. The center is designed to increase cross-national and cross-disciplinary collaboration on issues related to technological, financial, and social innovation, growth, and sustainability.

For a discussion of the Bush administration's decision process regarding Iraq and WMDs, see Hersh, S.M. "The Stovepipe." The New Yorker, October 27, 2003. (Online at http://www.newyorker.com/archive/2003/10/27/031027fa_fact)

For further reading on motivated reasoning:

Dawson, E., Gilovich, T., & Regan, D. 2002. "Motivated reasoning and performance on the Wason selection task." Personality and Social Psychology Bulletin 28 (10): 1379-1387.

Ditto, P.H., & Lopez, D.F. 1992. "Motivated skepticism: Use of differential decision criteria for preferred and nonpreferred conclusions." Journal of Personality and Social Psychology 63: 568-584.

Dunning, D., Meyerowitz, J.A., & Holzberg, A.D. 1989. "Ambiguity and self-evaluation: The role of idiosyncratic trait definitions in self-serving assessments of ability." Journal of Personality and Social Psychology 57: 7.
