
One Bias to Rule Them All

Engineering school, plus a few decades in the medical device industry, has given me ample reason to spend lots of time thinking about accuracy and precision.

Most of us will, at some point in our lives, find ourselves on the receiving end of one of those medical devices. When we do, we want to feel confident that the measuring instruments used to manufacture it were accurate.

The medical device in question might even be a measuring instrument itself, from something as simple as a thermometer to the wildly complex visualization systems that locate catheters inside the heart to treat life-threatening arrhythmias.

It was in the context of these sorts of critical measurements that I first encountered the idea of bias. In a measuring instrument, bias is a consistent and predictable deviation from truth.

Simple example: The thermometer consistently reads two degrees low. You thought the patient's temperature was 98.6°F, when it was actually 100.6°F.

That’s bias.
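
Because the deviation is consistent and predictable, it can be characterized and corrected. Here is a minimal sketch in Python, using the hypothetical two-degree offset from the example above:

```python
# A biased instrument is wrong in a consistent, predictable way,
# so once the offset is characterized it can be calibrated out.
# Hypothetical numbers: a thermometer that reads 2.0 degrees F low.

BIAS_OFFSET_F = -2.0  # the consistent deviation from truth

def displayed_temp(true_temp_f: float) -> float:
    """What the biased thermometer shows for a given true temperature."""
    return true_temp_f + BIAS_OFFSET_F

def corrected_temp(reading_f: float) -> float:
    """Recover the true temperature once the bias is known."""
    return reading_f - BIAS_OFFSET_F

reading = displayed_temp(100.6)  # displays 98.6, which looks normal
print(corrected_temp(reading))   # prints 100.6, revealing the fever
```

Random error averages out over repeated measurements; bias does not. That is what makes it dangerous, and also what makes it correctable once you know it is there.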

A Different Kind of Bias

These days, the biases I’m interested in are of the cognitive variety, but the underlying principle is exactly the same.

Bias is just a special kind of wrong.

A cognitive bias is simply a built-in, systematic, predictable error in thinking. Cognitive biases are artifacts of our evolution and culture, and there are many of them.

Often, they once provided some slight survival advantage. “When in doubt, assume lion in the bushes, not mouse” is a bias that saved our ancestors precious milliseconds.

That Negativity Bias once helped us evade deadly teeth and claws, at the relatively minor cost of unnecessarily fleeing a lot of harmless mice.

In today’s much safer world, that same built-in Negativity Bias mostly just makes us neurotic, just as Ingroup-Outgroup Bias makes us xenophobic.

Although our many cognitive biases are no longer as useful as they once were, they are here to stay, so we must learn to manage them.

Bordering on Delusional

In his book Thinking, Fast and Slow, Nobel Laureate Daniel Kahneman shares an example of a bias that is particularly relevant in the business world.

In the mid-1970s, Kahneman convinced the Israeli Ministry of Education that judgment and decision making should be taught in high schools. He assembled a team to design the curriculum and write a textbook for the class.

A year into the project, Kahneman asked each of the team members how long they thought it would take to complete the project.

He had them write their answers down instead of speaking them aloud to avoid Groupthink and the Anchoring Bias, where the first number people hear heavily influences, or anchors, their estimates.

(Asking people to write down their answers first is highly recommended for team meetings when you want to know what people really think, instead of what they think other people think they should think.)

The responses were tightly clustered around two years. No one thought less than eighteen months. No one thought more than thirty.

The Reference Class

Kahneman then asked Seymour Fox, dean of the Hebrew University School of Education and a highly experienced curriculum developer, if he had been involved in similar projects.

SF: Yes, lots!

DK: Great! How did the other groups do?

[Awkward silence.]

SF: Ummm... forty percent actually didn’t finish at all.

DK: What about the others?

SF: All the rest took between seven and ten years.

[Uh oh.]

DK: How does our team compare in terms of expertise and experience?

SF (without hesitation): Below average.

Make Like an Ostrich

Kahneman’s team had just been told that, on similar projects, other, more experienced teams had taken much longer to publish their books, if they finished at all.

Not twenty or even fifty percent longer than Kahneman’s team’s worst-case estimate of two and a half years, but three to four times as long.

So, with that reality check, what did they do?

They quickly set that information aside and moved on as if nothing had happened. They finished the book eight years later, just like the others.

Kahneman later noted that their original estimates, of which they were highly confident and in close agreement, were "bordering on delusional."

A Bias Is Born

Kahneman called the bias that he and his colleagues fell prey to the “Planning Fallacy.” 

The Planning Fallacy arises from:

  1. Failing to consider the actual results of similar cases (looking only at the “inside view” and never the “outside view”), and
  2. Believing that a plan is realistic when it is usually closer to a best-case scenario because of what Kahneman calls WYSIATI: what you see is all there is. We ignore the reality of unknown unknowns.

If you have ever done any project of any kind, you have almost certainly experienced the Planning Fallacy firsthand.

No One Is Exempt

I chose this example partly because it is so prevalent, but also because it happened to Daniel Kahneman himself, the guy who won a Nobel Prize for his work on biases and decision errors. 

In behavioral economics, Kahneman, who died earlier this year, was a towering figure. It is on his shoulders, and those of his research partner Amos Tversky, that all subsequent behavioral economists stand.

If it can happen to Kahneman, it can happen to anyone.

To believe otherwise is its own uber-bias, yet that is exactly what we do.

One Bias to Rule Them All

The Bias Blind Spot is the belief that you are less subject to biases than others, and that awareness of a bias renders you immune to it.

As if to prove its own validity, one survey found that eighty-five percent of people believe they are less biased than average (see the problem with that math?).

Understanding cognitive biases is a necessary, but not sufficient, step in avoiding them.

Additional action is required.

The Specific Case

In the case of the Planning Fallacy, one specific cure is called “reference class forecasting.”

The first step is to identify a “reference class” of similar cases to establish a baseline forecast for cost, timing, resource requirements, etc. The average of the reference class is then the anchor, and the plan is adjusted up or down around that anchor only if such adjustments are justified.

The reference class provides the outside view to balance our default inside view.

The outside view shows us WYSINATI: what you see is NOT all there is.
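
To make the mechanics concrete, here is a minimal sketch of reference class forecasting in Python. The numbers are hypothetical, loosely echoing the curriculum story; a real forecast would use your own reference class:

```python
# Reference class forecasting, in miniature.
# Hypothetical data, loosely matching the story above: completion
# times (in years) of similar past projects, with None marking the
# roughly forty percent that never finished at all.

past_projects = [7.0, 8.5, 10.0, None, 9.0, None, 7.5, None, None, 8.0]

finished = [t for t in past_projects if t is not None]
completion_rate = len(finished) / len(past_projects)

# The outside view: anchor on the reference class average...
anchor = sum(finished) / len(finished)

# ...then adjust up or down only if the adjustment is justified.
# A team that rates itself below average has no case for adjusting down.
adjustment = 0.0
forecast = anchor + adjustment

print(f"Completion rate:   {completion_rate:.0%}")   # 60%
print(f"Baseline forecast: {forecast:.1f} years")    # 8.3 years
```

Set against the team’s inside-view estimate of about two years, an anchor around eight is sobering. In Kahneman’s case, it also turned out to be right.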

The General Case

Biases are one manifestation of the “thinking fast” part of Thinking, Fast and Slow.

Thinking fast, without the involvement of your big, slow, energy-burning prefrontal cortex, is essential. We would not be able to function without the fast-thinking part of our brains, but we must also learn to question it, because it is full of biases.

Reference class forecasting gives us a specific example of how we can beat those biases.

But the only real antidote to cognitive biases in general is the hard work of thinking slow.

Thinking slow means understanding the cognitive biases at play, accepting that you are still vulnerable to them even though you know they are there, and finding safeguards like reference class forecasting to counteract them.

Check Your Mirrors

It’s easier said than done. It requires working against our own wiring, and taking a path that is cognitively harder. It is doubly difficult when the stakes are high and strong emotions are involved. Thinking slow under pressure will challenge anyone’s emotional intelligence, but it can be done.

Avoiding biases, that special kind of wrong, requires considering the outside view, not just the inside view. It demands that we take a breath and slow things down, and it’s something we help our coaching clients with all the time.

A good coach becomes a mirror, and that matters, because no one can see their own blind spots.

Until next time,
Greg

Do you know someone who might benefit from this blog post?

If you like what you've read here, please forward it to a friend. It's a low-risk way to show them a little of who you are, and it might inspire them to return the favor. Everybody wins. :-)

Want to stay connected?
Join the Retexo newsletter!

If you enjoy our blog, you may want to join our newsletter, Untangled, and read our posts even before they are published online, conveniently delivered to your inbox once a week.

We promise never to share your name or email, and you may unsubscribe at any time.
