Please Tell Me I'm Wrong
Some leaders tell their followers to prove them right; others dare them to prove them wrong.
How often have you heard a leader genuinely say it? “Prove me wrong.”
Not rhetorically. Not as theatre, not as a challenge issued only after the decision has already been made.
I mean sincerely: “Prove me wrong.”
I can tell you how many times I’ve heard it. Never.
Hypothetically, yes. From “gurus”, fine. But from a real person, in the course of normal business? Never.
Spend enough time around leadership teams and you notice something uncomfortable. Many leaders claim they welcome challenge. Few actually structure their decision processes to receive it.
Psychology offers a clear explanation for why. And that difference may be what separates mediocre leadership from high-performing leadership.
The Structural Pull Toward “Prove Me Right”
Human judgment is not naturally impartial. One of the most robust findings in cognitive psychology is confirmation bias—the tendency to seek, interpret, and recall information in ways that confirm existing beliefs while discounting contradictory evidence.1
In everyday situations this bias is manageable. In leadership contexts it becomes amplified.
Authority changes the informational environment. Subordinates may hesitate to challenge their boss. Meetings drift toward agreement rather than interrogation. Organisational cultures begin rewarding alignment rather than dissent. Not because of cowardice but because of human psychology.
Over time, leaders may unknowingly receive a filtered stream of information, one that increasingly supports their prior judgments. When those judgments are sound, the filter is merely flattering. When they are flawed, it ensures the flaws go unchallenged.
Irving Janis described this phenomenon as groupthink, where pressures toward consensus suppress the critical evaluation of alternatives.2 Teams become less likely to question assumptions, explore failure scenarios, or surface inconvenient facts. That cohesion is harmless when decisions are simple. It is dangerous when they are not.
Over time, in such environments, leaders rarely hear “you may be wrong.”
Not because the leader is always correct—but because the system quietly discourages contradiction.
The Discipline of Disconfirmation
One of the central disciplines of judgment is therefore the active search for disconfirming evidence.
Research on organisational learning illustrates why this is difficult. Chris Argyris and Donald Schön distinguished between single-loop learning—adjusting actions while preserving underlying assumptions—and double-loop learning, where those assumptions themselves are questioned.3
Double-loop learning is cognitively demanding. It is even harder in leadership roles, where authority, reputation, and identity are often tied to being right.
Studies on defensive reasoning show that individuals frequently protect existing beliefs by rationalising errors or reframing criticism as illegitimate.4 The mind quietly moves to defend the current narrative.
Which is precisely why leaders must deliberately build mechanisms that challenge their thinking.
Double-loop learning is hard. For all of us. But if we want to be high-performing decision makers, we must ask ourselves: “Am I merely confirming what I want to believe? Am I circling a single loop, adjusting actions while my assumptions go unexamined? Have I attempted the double loop?”
The Role of Red Teams
One of the most effective tools for doing this is the red team.
A red team is a structured group tasked with critically challenging plans, assumptions, and decisions before they are implemented. Rather than trying to prove a plan correct, the red team’s role is the opposite: to expose how it might fail.
In other words, a solid leader won’t just seek confirmation that their idea is right. They’ll commission a team to prove them wrong.
Research on structured dissent supports this approach. Gary Klein’s work on the premortem technique demonstrates that asking teams to imagine a project has already failed—and then identify the reasons why—substantially increases the identification of risks and potential failure points.5
Similarly, studies of forecasting and decision-making suggest that structured adversarial analysis improves judgment accuracy by forcing decision makers to evaluate competing hypotheses rather than defend a single narrative.6
The insight is straightforward: good judgment rarely emerges from agreement alone.
It emerges from disciplined challenge.
SEAL Culture
You’d expect that in the military, it would be a “do as I say” attitude, right?
But actually in the Navy SEALs—often discussed publicly by retired officer Jocko Willink—junior personnel are expected to raise concerns if they believe a plan contains flaws before execution begins. In high-risk environments, failing to surface critical information can have serious consequences, so questioning assumptions is treated as part of professional responsibility rather than insubordination.
Research on naturalistic decision making (NDM) shows that experienced teams improve decision quality when members share different perspectives and challenge assumptions before action is taken (Klein, 1998; Klein, Calderwood, & Clinton-Cirocco, 2010).
This principle also appears in studies of high-reliability organisations, where safety and effectiveness depend on cultures that encourage people to speak up about potential problems (Weick & Sutcliffe, 2007).
The lesson translates easily beyond the military: strong teams do not avoid disagreement. Instead, they create environments where concerns can be raised early, ideas can be tested, and decisions become stronger before they are put into practice.
The Leader’s Psychological Test
Yet the effectiveness of these methods depends on something deeper than process. It depends on the leader.
Even when formal mechanisms for dissent exist, people quickly learn whether challenge is truly welcome. Leaders who react defensively, subtly or overtly, signal that criticism is a career risk rather than an intellectual contribution.
Research on psychological safety shows that teams perform better when members believe they can raise concerns or admit mistakes without fear of punishment or humiliation.7 In such environments information flows upward rather than being suppressed.
But psychological safety is fragile. It is created not by slogans, but by repeated leader behaviour. When criticism is met with curiosity rather than defensiveness, dissent becomes normalised.
When criticism is punished—even subtly—it disappears.
Wisdom Over Being Right
There is a deeper leadership question beneath all of this.
What matters more: being right, or achieving results?
Leaders who attach their identity to being correct often resist contradiction. Leaders who attach their identity to the mission welcome information that improves outcomes—even when it challenges their own ideas.
From the perspective of judgment science, the latter is far more adaptive.
Studies on intellectual humility suggest that individuals who recognise the limits of their knowledge engage more effectively with conflicting information and make more accurate judgments over time.8
In other words, the strongest leaders are not those most certain of their own correctness.
They are those most committed to finding the truth, even when it challenges their own thinking.
The Real Test of Leadership
Every leader says they want honest feedback. Few truly mean: prove me wrong.
Yet responsible leadership demands exactly that. Complex decisions rarely fail because leaders lacked confidence. They fail because leaders lacked contradiction.
The discipline of judgment requires something uncomfortable: the willingness to invite challenge, examine assumptions, and design systems that expose errors before reality does.
Red teams do not weaken leadership.
They strengthen it.
Because the goal of leadership is not to protect the leader’s beliefs.
It is to ensure the organisation makes the best possible decisions—especially when the leader might be wrong.
Footnotes
1. Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175–220.
2. Janis, I. L. (1972). Victims of Groupthink: A Psychological Study of Foreign-Policy Decisions and Fiascoes. Houghton Mifflin.
3. Argyris, C., & Schön, D. (1978). Organizational Learning: A Theory of Action Perspective. Addison-Wesley.
4. Argyris, C. (1991). Teaching smart people how to learn. Harvard Business Review, 69(3), 99–109.
5. Klein, G. (2007). Performing a project premortem. Harvard Business Review, 85(9), 18–19.
6. Tetlock, P. E., & Gardner, D. (2015). Superforecasting: The Art and Science of Prediction. Crown.
7. Edmondson, A. C. (1999). Psychological safety and learning behavior in work teams. Administrative Science Quarterly, 44(2), 350–383.
8. Krumrei-Mancuso, E. J., & Rouse, S. V. (2016). The development and validation of the Comprehensive Intellectual Humility Scale. Journal of Personality Assessment, 98(2), 209–221.
