In Defense of the Devil’s Advocate

Edmans is Professor of Finance at London Business School. His TED talk "What to Trust in a Post-Truth World" has been viewed two million times. His latest book is May Contain Lies: How Stories, Statistics, and Studies Exploit Our Biases–And What We Can Do About It.

My heart was beating out of my chest, and I was breaking into a cold sweat. You’d expect nerves from a Ph.D. student presenting at a conference full of professors, but I’d just finished my talk. I’d practiced that presentation dozens of times and never quite nailed it. Yet gameday had come, and I’d hit every note. So why the jitters?

As I took my seat, Patrick Bolton stood up. He was one of the world’s most respected researchers in corporate finance theory, the topic I’d just spoken about, and would later become president of the American Finance Association, the most prestigious position in my profession. Patrick wasn’t presenting one of his own papers but instead was assigned the role of “discussant”—to read my paper beforehand and give an independent view.

The discussion started well enough, with Bolton calling my idea “intuitively plausible,” but then he said a key ingredient in my study “makes no sense.” I felt like I’d been punched in the stomach. I’d been thrilled to be invited to this conference, where I was the only student in a room full of professors. But now they’d be going back home and telling their colleagues about this young upstart who gate-crashed an event for seasoned faculty and presented nonsense.

Bolton then moved on to discussing the next paper in the session, but I was so upset that I tuned it out. This research was by a senior professor, so Bolton would surely be full of praise, making mine seem even more flawed by comparison. I was shaken out of my daze when Patrick said that this study might have an "endogeneity problem." That's a major blow to any scientific study: it means you've shown a correlation, but not causation.

That was the first morning of the week-long conference. Virtually every other discussion took a similar tone to Bolton's: it started by commending the question the researchers were exploring, then explained why they hadn't yet fully nailed the answer, whether due to alternative explanations or other quibbles. Over that week, I realized that constructive criticism is simply part of the academic process. The whole point of presenting at a conference is that you can only take an idea so far by yourself. There's no stigma in receiving negative comments; they're simply expected. If a discussant were ever entirely positive, the discussion would have so little credibility that the audience would think you had incriminating photos of him.

The value of this practice applies far beyond academia. Most people understand the power of the scientific method, but what matters here is the "scientific culture": an environment where people put out bold, innovative ideas, actively seek dissenting opinions, and revise their proposals to address the criticisms. Such a culture is valuable to any organization. If critiquing plans is part of the fabric, there's no shame in receiving pushback. Nor is there any fear in raising concerns: doing so helps colleagues refine their ideas rather than stabbing them in the back. Highlighting flaws isn't unkind; in fact, one of the most unkind things you can do is to notice a problem and not point it out.

The non-academic equivalent of a discussant is a "devil's advocate," who highlights the blind spots in a proposal. Sometimes, an entire group is tasked with this job, known as a "red team." This was practiced by ExComm, the committee set up by John F. Kennedy to advise him on how to respond to the Cuban Missile Crisis. It was divided into two teams, one supporting an invasion and the other a blockade. After the two sides wrote their initial papers, they exchanged them and gave feedback on the other's proposal. Each group then revised its plan to take the concerns into account.

Sometimes you don’t need to appoint a red team; the culture is such that one naturally emerges. When he ran General Motors, Alfred Sloan closed a meeting by asking “I take it we are all in complete agreement on the decision here?” Everyone nodded. Sloan continued, “Then, I propose we postpone further discussion of this matter until our next meeting to give ourselves time to develop disagreement and perhaps gain some understanding of what this decision is about.” He believed that no decision is black and white, and if no one raised any concerns, this wasn’t because there weren’t any but because he hadn’t yet given his colleagues time to think of them.

Devil’s advocates can even be automated. Singaporean bank DBS brings a Wreckoon, a raccoon-themed mascot wielding a hammer, into every major meeting. At random times, a PowerPoint slide appears featuring the Wreckoon, accompanied by a question such as "What have we missed out?", "What is our riskiest assumption?", "What could go wrong?" or "Where is the data?" This prompts leaders to pause and give airtime to dissenting views.

Knowing that criticism will come your way drives you to make your idea as strong as possible beforehand. Researchers will do all they can to pick holes in their own paper before sending it to a discussant. This practice is known as a pre-mortem. In a post-mortem, a decision has flopped and you try to figure out why. In a pre-mortem, you imagine that a failure has occurred and think about all the possible causes.

A final feature of a scientific culture is the value given to dissenting voices. You might think it strange that people ever agree to be a discussant—you fly halfway around the world to be the bad guy in the room—but the profession greatly respects members who give tough but constructive evaluations. Doing so boosts their reputation, and many conferences give “best discussant” awards.

Some companies aim to foster such a culture. X, Google’s moonshot factory, gives a bonus to any employee who finds a fatal flaw that leads to their own team’s project being killed. This in turn inspires X’s engineers to be yet more daring—if they propose a crazy idea that has a fundamental defect, they’re confident that a colleague will notice it and scrap the innovation before it costs the company millions of dollars. The better a car’s brakes, the more you can push on the accelerator.

Yet not every company values dissent. In May 2022, Stuart Kirk, HSBC’s Head of Responsible Investment, gave a speech arguing that investors needn’t worry about climate change. This talk led to instant outrage, but the content was more nuanced than the headlines suggested. He pointed out that, even if the planet becomes warmer, we can invest in adapting to higher temperatures. Nor did he say that climate change isn’t a serious threat to society but rather that investors don’t bear the risks as their horizons are too short-term. HSBC suspended him, even though they’d previously signed off on the content of his talk, and the furor led to him resigning shortly afterwards.

Kirk’s delivery was sometimes sardonic, with the most-quoted line being "Who cares if Miami is six meters underwater in 100 years? Amsterdam has been six meters underwater for ages, and that’s a really nice place. We will cope with it." However, if we set aside our emotions about the tone and focus instead on the content, the speech did an important service by providing a contrasting opinion: that we’re focusing almost exclusively on climate-change mitigation and not enough on adaptation, and that investors won’t worry enough about climate change until regulators make them pay the price through carbon taxes. Suspending someone for expressing a dissenting view, even on a topic we might feel strongly about, is a deterrent to diverse thinking.

Companies are paying increasing attention to diversity under the assumption that diverse teams make better decisions. But it’s not enough to take an "add diversity and stir" approach, where a company simply hires a mix of people and leaves them to work their magic. It needs to take deliberate steps to foster a diversity of thinking. Appointing devil’s advocates, holding pre-mortems, and encouraging different viewpoints are valuable tools for building smart-thinking organizations.

Reprinted with permission from May Contain Lies: How Stories, Statistics, and Studies Exploit Our Biases—And What We Can Do about It by Alex Edmans, courtesy of the University of California Press. Copyright 2024.
