Your Confidence Is Dangerous

A great many human problems do not begin with bad intentions. Instead, they often begin with certainty. Not justified certainty. Not earned certainty. Just the feeling of being right.

You see, we do not merely make mistakes. We often make them with conviction, and that is where things get dangerous. Being wrong on its own is usually recoverable; being wrong while highly confident often is not.

One of the most famous demonstrations of this is known as the Linda problem.

Pioneering Israeli-American psychologists Amos Tversky and Daniel Kahneman described Linda as bright, politically engaged, and deeply concerned with justice. They then posed a deceptively simple question: “Is it more probable that Linda is a bank teller, or that Linda is a bank teller and active in the feminist movement?”

In their classic 1983 paper, 85% of respondents chose the second option, even though it is logically less probable than the first: a more specific category can never be more likely than the broader one that contains it.
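To see why, put numbers to it. The sketch below uses probabilities I have invented purely for illustration; whatever values you choose, the conjunction can never be more probable than the broader category alone.

```python
# Conjunction rule: P(A and B) = P(A) * P(B given A) <= P(A).
# The probabilities below are invented purely for illustration.

p_teller = 0.05                  # P(Linda is a bank teller)
p_feminist_given_teller = 0.30   # P(active feminist, given she is a teller)

p_teller_and_feminist = p_teller * p_feminist_given_teller

print(f"P(bank teller)         = {p_teller:.3f}")
print(f"P(teller AND feminist) = {p_teller_and_feminist:.3f}")

# Since p_feminist_given_teller can be at most 1, the conjunction
# can never be more probable than "bank teller" alone.
assert p_teller_and_feminist <= p_teller
```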

Given that the logic is not especially complex, the result is rather striking.

The real issue, though, is that people were not primarily answering with logic. They were answering with fit. The second description felt more like Linda. It matched the story better. It sounded more true, even though it was less likely to be true.

Tversky and Kahneman linked this to what they had earlier called the representativeness heuristic: a mental shortcut people use to judge probability by resemblance rather than by statistical structure.

One example is the polished executive who presents a confident turnaround plan full of detailed language, and the board relaxes because the story feels coherent.

Another is the political commentator who gives a sweeping explanation of a conflict with total certainty, and millions nod along because the narrative is neat.

Or what about the manager who insists a risky hire will work out because the candidate “looks like a leader”?

The problem is not that stories are useless. Human beings need stories because they help us organise complexity. They help us act.

Stories, however, become dangerous when they begin to override base rates, probabilities, and contradictory evidence. It is in these moments that we stop asking, “What is most likely?” and start asking, “What fits the picture in my head?”

Research on overconfidence shows that it is not one single thing. Don Moore and Paul Healy distinguished between overestimation, overplacement, and overprecision.

Overestimation is thinking you are better than you are. Overplacement is believing you are better than others by more than you really are. Overprecision is excessive faith in the accuracy of what you believe.
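Overprecision is the easiest of the three to see concretely. Here is a minimal simulation, with numbers that are my own assumptions rather than anything from Moore and Healy’s data: a forecaster who believes their estimates are twice as precise as they really are will report “90% confident” intervals that contain the truth far less than 90% of the time.

```python
# Overprecision: stated 90% confidence intervals that are too narrow.
# Every parameter here is an assumption chosen for illustration.
import random

random.seed(0)
TRUE_SD = 10.0     # actual spread of errors around the estimate
ASSUMED_SD = 5.0   # the forecaster believes errors are this small
Z_90 = 1.645       # half-width multiplier for a 90% normal interval

trials = 100_000
hits = 0
for _ in range(trials):
    error = random.gauss(0, TRUE_SD)   # the real estimation error
    half_width = Z_90 * ASSUMED_SD     # the interval the forecaster states
    if abs(error) <= half_width:
        hits += 1

print(f"Stated confidence: 90%, actual hit rate: {hits / trials:.0%}")
# Prints roughly 59%: the intervals feel precise but miss far too often.
```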

These errors in thinking make us less likely to update, less willing to seek advice, and more inclined to act as though uncertainty has already been resolved.

As Moore’s review notes, overconfidence has been described as one of the most prevalent and potentially catastrophic problems in judgment and decision-making. Kahneman himself called it one of the most significant cognitive biases.

Look at markets, for example. Excessive certainty about asset values often leads to excessive trading, poor risk management, and a willingness to dismiss what others know.

Or look at leadership. Research by Fast and colleagues found that power often increases overconfidence in the accuracy of one’s knowledge. This is important because many high-impact decisions, such as acquisitions, market entry, and partnerships, are made by people who are rewarded for appearing decisive.

Confidence, therefore, becomes part of the costume of authority. The problem, though, is that the costume then starts shaping the mind inside it.

People with the most power are often expected to project certainty. And yet, certainty is exactly what complex environments should make us suspicious of.

In reality, high-quality thinking in uncertain conditions should not be loud but calibrated. It should recognise ranges, alternatives, and the possibility of error.

Philip Tetlock’s work on political forecasting found that experts are often much less reliable than their confidence suggests. Later forecasting research showed that even relatively brief debiasing training improved accuracy in geopolitical forecasting tournaments by 6-11%, as measured by Brier scores.

It is an important finding because it suggests the problem is not just a matter of intelligence but one of judgment and discipline.
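For readers unfamiliar with the metric, a Brier score is simply the mean squared difference between a probability forecast and the outcome, so lower is better. The sketch below uses forecasts and outcomes I have made up for illustration:

```python
# Brier score: mean squared difference between a forecast probability
# and the outcome (1 = happened, 0 = did not). Lower is better.
# The forecasts and outcomes below are invented for illustration.

def brier(forecasts, outcomes):
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

outcomes = [1, 0, 0, 1, 0]

overconfident = [0.95, 0.90, 0.05, 0.99, 0.80]  # loud, extreme probabilities
calibrated    = [0.70, 0.40, 0.20, 0.75, 0.35]  # hedged, range-aware ones

print(f"Overconfident Brier score: {brier(overconfident, outcomes):.3f}")  # 0.291
print(f"Calibrated Brier score:    {brier(calibrated, outcomes):.3f}")     # 0.095
# Being loudly wrong (saying 90% for something that never happens)
# costs far more than a cautious 40%.
```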

The most dangerous bias may not be simple ignorance, as ignorance at least leaves room for caution. The more dangerous condition is misplaced confidence, because it feels like knowledge while behaving like blindness. It removes the friction that uncertainty is supposed to create and turns assumptions into action too quickly.

Many people have experienced this in relationships. Someone becomes certain about another person’s motives and stops checking whether they are right. A strained conversation becomes a prosecution.

At work, a leader becomes convinced that an employee is “not leadership material,” and every future behaviour is interpreted through that lens. In families, one old story about who is responsible, selfish, weak, or difficult can harden into something that feels factual simply because it has been repeated for years.

So how can we use this information?

Step 1: Become suspicious of the relief that certainty brings.

There is comfort in having a simple answer, such as a named culprit, a tidy forecast, or a story that explains everything. But these are not proof in and of themselves. Psychological comfort and the truth are not the same thing. The Linda problem demonstrates how easily we trade one for the other.

Step 2: Force yourself to ask questions.

Ask what broader category contains the narrower one. Ask what the base rate is. Ask what would have to be true for the opposite conclusion to make sense. Ask what information is missing, not just what information is present.

Moore and colleagues note that overconfidence is often driven by neglecting the unknowns. We tend to anchor on what we can see and become too sure because the unseen never makes it into the room.
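Bayes’ rule is the formal version of this questioning, and base rates are where resemblance and probability come apart. Here is a sketch using the earlier “looks like a leader” example, with every number an assumption of mine rather than data from any study:

```python
# Bayes' rule: P(success | looks the part) =
#     P(looks the part | success) * P(success) / P(looks the part)
# Every number below is an assumption, chosen only to show how a low
# base rate tempers a compelling impression.

base_rate = 0.10             # P(success): risky hires rarely work out
p_look_given_success = 0.80  # successful hires often "look like leaders"
p_look_given_failure = 0.40  # but plenty of unsuccessful ones do too

p_look = (p_look_given_success * base_rate
          + p_look_given_failure * (1 - base_rate))

posterior = p_look_given_success * base_rate / p_look
print(f"P(success | looks like a leader) = {posterior:.0%}")
# Prints 18%: the impression doubles the odds of success,
# yet failure remains far more likely than not.
```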

Step 3: Be authentically humble.

Real humility is not saying, “I could be wrong,” and then behaving as though you could not possibly be wrong.

Instead, it is using probabilities rather than absolutes when absolutes are not warranted. It is inviting disagreement before consequences arrive. It is being willing to look less certain in the short term so that you can be less wrong in the long term.

Closing Thoughts

Our culture often rewards confidence long before it rewards accuracy.

The person who speaks crisply is assumed to think clearly. The person who speaks confidently is rewarded, and those who hesitate are doubted, even when that hesitation reflects care and intellectual honesty.

So we train ourselves.

We learn to sound sure, then gradually forget the difference between sounding sure and being correct. Confidence in our own intuition often outruns the evidence before us.

However, once you know that the mind often confuses plausibility with probability, you can start building better habits. You can slow down. You can separate confidence from correctness. You can ask better questions. You can become harder to fool, especially by yourself.

Until next time, remember that the biggest threat to good judgment is not the fact that people are emotional, irrational, or uninformed. Instead, it is that they can be wrong in ways that feel completely reasonable from the inside.

The danger begins when a flawed judgment arrives wrapped in the sensation of truth.

Dion Le Roux

References

  1. Chang, W., Chen, E., Mellers, B., & Tetlock, P. E. “Developing expert political judgment: The impact of training and practice on judgmental accuracy in geopolitical forecasting tournaments.” Judgment and Decision Making (2016).

  2. Fast, N. J., Sivanathan, N., Mayer, N. D., & Galinsky, A. D. “Power and overconfident decision-making.” Organizational Behavior and Human Decision Processes (2012).

  3. Moore, D. A., & Healy, P. J. “The trouble with overconfidence.” Psychological Review (2008).

  4. Moore, D. A., et al. “Overprecision in Judgment.” Review chapter available at learnmoore.org.

  5. Tetlock, P. E. Expert Political Judgment: How Good Is It? How Can We Know? Princeton University Press (2005).

  6. Tversky, A., & Kahneman, D. “Judgment under Uncertainty: Heuristics and Biases.” Science (1974).

  7. Tversky, A., & Kahneman, D. “Extensional versus intuitive reasoning: The conjunction fallacy in probability judgment.” Psychological Review (1983).

  8. Veritasium. “Why People Are So Confident When They’re Wrong.” YouTube video.
