Sen. Dianne Feinstein, D-Calif., chairwoman of the Senate Intelligence Committee, talks with reporters after sharing a report on the CIA and its torture methods, December 9, 2014.
Tom Williams—CQ-Roll Call, Inc.
December 16, 2014 12:57 PM EST

Matt Motyl is an assistant professor at the University of Illinois at Chicago and a political psychologist who studies group conflict.

If you know a person’s politics, you can make an educated guess about their views on the events in Ferguson, the death of Eric Garner, the alleged rape at the University of Virginia, and the CIA’s “dark sites” around the world. Of course, people will justify their condemnation (or praise) of the CIA’s “enhanced interrogation techniques” by claiming that torture doesn’t produce reliable evidence (or that it does). But what’s the relationship between their moral evaluation of the practice and their belief in its efficacy? For most people, the evaluation comes first, and it leads to their beliefs about whether or not torture “works.”

Social psychologists have studied this process for decades. In 1979, Charles Lord, Lee Ross and Mark Lepper asked Stanford undergraduates to evaluate the quality of two scientific studies examining whether capital punishment deters future crime. The two studies used the same methods but were evaluated in strikingly different ways by supporters and opponents of capital punishment. You’d think that people exposed to a range of findings would be pulled toward the center, but in fact the participants ended up further apart than when they began. People drew support from their favored study and dismissed the other one.

More recently, Yale University professor Dan Kahan conducted an experiment in which he gave participants profiles of experts on climate science, nuclear-waste disposal and concealed-carry gun laws. All of these experts had advanced degrees from the world’s foremost universities and held prestigious jobs in their fields. After participants read the profiles, Kahan presented them with the experts’ conclusions, which either supported or refuted the participants’ views. When participants were then asked to evaluate the experts’ scholarly credentials (remember, all of these authors had similarly remarkable academic bona fides), Kahan found that they viewed the scientists as experts only if the scientists’ conclusions confirmed the participants’ pre-existing beliefs.

Both of these studies, and dozens more like them, suggest that people apply different standards to evidence that supports their views than to evidence that challenges them. We are prone to uncritically accept arguments and information that confirm our view while unfairly rejecting those that challenge it—regardless of what that view is.

Consider the challenge of sifting through thousands of pages of documentation on the effectiveness of the CIA’s brutal interrogation tactics in generating actionable intelligence. How might you respond if you were presented with evidence that challenged your view on torture? If you oppose torture and saw evidence suggesting that it does result in high-quality actionable intelligence, would you change your position and come to support the use of torture? If you support the use of torture in some situations and saw evidence suggesting that it never results in any quality intelligence, would you change your position and come to oppose the use of torture? Odds are you would discount the information that challenged your views on torture.

In part, this is because torture is a moral issue, and we are especially motivated to preserve faith in the truth of our moral beliefs. Research by social psychologists Peter Ditto and Brittany Liu demonstrates that people’s moral beliefs shape their interpretation of facts. Specifically, in a survey of more than 1,500 people, they asked how moral or immoral forceful interrogations were and how likely those interrogations were to yield positive consequences, like actionable intelligence. They found that people who believed torture was inherently immoral assumed that any information gleaned from it was likely to be unreliable. On the other hand, people less squeamish about the morality of torture assumed that information gleaned from it was potentially life-saving. In a related experiment, they found that when people were led to think of a political policy in moralistic terms, their beliefs about the policy’s costs and benefits shifted to fit their moral view; the more morally desirable the policy, the more effective it was seen to be. In other words, people’s beliefs about the morality or immorality of torture biased their interpretation of the facts regarding torture’s (in-)effectiveness.

Former Senator Daniel Patrick Moynihan famously said, “You are entitled to your own opinion, but you are not entitled to your own facts.” Research, though, suggests that it is not so easy to separate fact from belief. Our beliefs change what “facts” we decide count as facts. Furthermore, once we decide that something is a moral issue and that our position occupies the moral high ground, facts become less relevant. If they do not confirm our belief, we assume they were produced by people biased by some ulterior motive. For example, Republican Senators Saxby Chambliss of Georgia and Mitch McConnell of Kentucky, who generally supported the use of enhanced interrogation techniques and opposed the declassification of the CIA torture memo, responded in a joint statement that the “study by Senate Democrats is an ideologically motivated and distorted recounting of historical events.” In contrast, President Obama, who generally opposed the use of enhanced interrogation techniques, responded to the release of the report by saying it “reinforces my long-held view that these harsh methods were not only inconsistent with our values as a nation, they did not serve our broader counterterrorism efforts or our national-security interests.” In other words, supporters and opponents of torture, conservatives and liberals alike, exhibited the same motivated cognitive bias, evaluating information in ways that confirmed their beliefs.

In conclusion, the human brain is built to evaluate evidence in biased ways. If information fits with our moral values, we are quick to accept that evidence as strong and true, furthering our belief that we are correct and that anyone who disagrees with us is wrong and probably immoral. If the information contradicts our moral values, we are quick to discount that evidence as flawed or biased by the nefarious ideological motives of others. This tendency is especially pronounced when the evidence is complex and ambiguous, as is the case with the Senate’s CIA torture report and with the conflicting testimony of the dozens of witnesses interviewed in the grand jury hearings in the Darren Wilson case in Ferguson.

So the next time you’re debating torture or any other contentious political issue — climate change, genetically modified foods, hydraulic fracturing or the invisible hand of the free market — remember that your opinion is just as biased as that of the person you are debating and that your beliefs may not be based on facts. Rather, your facts may be based on your beliefs. And that goes for the other side too.

