Hindsight bias is your tendency to view the outcome of an event as inevitable: the belief that you could have accurately predicted an outcome once you already know what it is.
A few years ago I had the worst night of camping of my life.
We arrived at Lopez Island, off the coast of Washington state, late in the evening, so we had few campsites to choose from. We selected a site on a bluff overlooking the Strait of Georgia and the Olympic Mountains—a great view. Why had no one thought to claim the site?
We found out why a few hours later when a storm blew through the island. The rain soaked all our belongings. The wind kept us awake all night and destroyed our tent.
When the sun came up, it was obvious why nobody else had selected our site. What looked like the perfect location with a great view turned out to be a terrible location during a storm, unprotected from the wind and the rain.
How could we not have seen it coming?
The warning signs were there:
- Nobody else had selected the campsite. Maybe there was a good reason.
- The bluff was exposed to the wind.
- Sleeping out in the open means less protection from the rain.
However, consider this:
- There are almost never windstorms in Washington in the summer. Why should we have expected one?
- Most tents are designed to withstand the wind fairly well. It wasn’t the first time I’d camped in the wind.
- Most tents are designed to keep the rain out. Camping in the rain is pretty normal in the northwest.
- There could have been other reasons why our campsite was unoccupied when we arrived. It was the farthest from the dock; most people don’t want to haul their gear to the farthest site and bypass several acceptable sites along the way.
With the benefit of hindsight, we obviously made a bad decision.
But in the moment, we probably made a good decision. People may have chosen that campsite for the previous thirty days without a problem. The night I chose the campsite happened to be the night we had a storm.
Now that I know the outcome, it’s difficult for me to understand my decision in an objective way. The outcome of that decision clouds my judgment of the decision-making process.
Hindsight bias is everywhere
You’ve probably experienced hindsight bias, too.
Your decision not to buy Microsoft, Google, or Apple stock years ago, or Bitcoin months ago, might seem irrational in hindsight, even though it was probably entirely rational at the time.
Your view of Trump’s election to the presidency affects how you interpret the events of the election.
If you’re a Cubs fan, you probably can’t conceive of a reality that didn’t involve them winning the World Series in 2016. It was their time.
In 2007, Victor Keegan wrote an article in The Guardian titled, “Will Myspace ever lose its monopoly?”[1] In hindsight, that question—and the fact that the article doesn’t mention Facebook—sounds absurd. But at the time, it was a reasonable prediction.
Hindsight bias in the courtroom
Hindsight bias is especially dangerous in the courtroom because jurors and judges know the outcome before they hear a word of testimony. They’re there precisely because a crime has been committed; their task is to review the evidence and decide who may have done it.
What can look obvious in hindsight—it’s obvious the defendant knew what he was doing—might not have been so clear-cut in the moment.
Take these examples from the handbook, Jury Selection:
- “Even though he was a kid, he should have known that once he showed the others who had been drinking that he had a gun, things would get out of hand. He should have known guns invited violence.”
- “I can’t understand why the managers didn’t try to get more information or use the information they had available. They should have known there would be safety problems at the plant.”
- “It should have been obvious to everyone that things weren’t going well. Warnings began to appear on the web and in letters to the editors about the real estate bubble. If even people outside the securities and economics [fields] saw the recession coming as early as 2007, why didn’t Wall Street recognize the signs? Greed, that’s why.”[2]
One study evaluated a case involving a dangerous stretch of railroad track where an accident had occurred. Jurors were asked to determine whether the railroad had been negligent in continuing to operate on the track, and to award compensatory damages to the victims based on their injuries. When jurors learned that the accident had also spilled toxic chemicals into a nearby river, they were twice as likely to make the railroad pay damages, even though in both versions of the case—one with the toxic spill and one without—the injuries were the same. The spill had no effect on the level of injuries, yet simply knowing about this additional outcome made jurors perceive more ill intent on the railroad’s part. When a toxic spill was involved, jurors viewed the accident as more foreseeable and awarded greater damages for the same injuries.[3]
Other studies have found that juries view searches of private property as more permissible if something illegal is found. Yet searching someone’s private property without a warrant is illegal not because of what is (or isn’t) found, but simply because it’s someone’s private property.
Researchers asked mock jurors to award damages to the victims of an illegal police search. Groups of jurors were presented with three scenarios:
- An illegal search where drugs were discovered.
- An illegal search where no drugs were discovered.
- An illegal search with no known outcome.
In the scenario where drugs were found and the defendant was convicted, jurors awarded smaller damages to compensate for the illegal search, even though what the search turned up had no bearing on its legality. Jurors couldn’t help but let the outcome influence their perception of intent.[4]
You won’t go far in the legal system without encountering hindsight bias in some form or another.
Hindsight bias in politics
Winston Churchill defined politics as “the ability to foretell what is going to happen tomorrow, next week, next month and next year. And to have the ability afterward to explain why it didn’t happen.”
After the 2012 presidential election between Barack Obama and Mitt Romney, pundits who predicted a Romney landslide were left explaining themselves. Perhaps most famous of these was Dick Morris.
After the election, Morris explained why his prediction had been incorrect:
I derided the media polls for their assumption of what did, in fact, happen: That blacks, Latinos, and young people would show up in the same numbers as they had in 2008. I was wrong. They did. But the more proximate cause of my error was that I did not take full account of the impact of hurricane Sandy and of Governor Chris Christie’s bipartisan march through New Jersey arm in arm with President Obama. Not to mention Christie’s fawning promotion of Obama’s presidential leadership. It made all the difference.
…Sandy, in retrospect, stopped Romney’s post-debate momentum. She was, indeed, the October Surprise. She also stopped the swelling concern over the murders in Benghazi and let Obama get away with his cover-up in which he pretended that a terrorist attack was, in fact, just a spontaneous demonstration gone awry.
Morris blames his incorrect prediction in part on the joint efforts of Barack Obama and Chris Christie to help New Jersey recover from Hurricane Sandy. But Morris knew just as much about their work before the election as he did after it. Only after the election did it become, in his telling, a relevant factor in explaining the outcome.
Before we blame Morris for his error, it’s worth keeping in mind that we make the same mistake all the time. If you closely followed the 2016 United States presidential campaign, it’s difficult to view any of the events—the email server, the “Mexican rapists” comment, or the Access Hollywood tape—through the lens of a Clinton victory. No matter how outlandish the campaign, the outcome of a Trump victory informs how we construct a narrative about how he won.
Hindsight bias exists in elections of all kinds. In October 1991, the United States Senate held a confirmation vote for Clarence Thomas, who had been nominated to the Supreme Court. The day before the vote, people were asked to predict the outcome and to rate their confidence in their predictions. 76.8% of subjects predicted Thomas’s confirmation, but one month later, 79.3% recalled having predicted it. The day before the vote, subjects were 67.6% sure of their predictions; one month later, they remembered being 70.4% sure.[5]
Hindsight bias in predicting unlikely events
Hindsight bias also affects how people view unlikely events, such as terrorist attacks and natural disasters. In the bipartisan U.S. House committee report on the preparation for and response to Hurricane Katrina, the phrase “should have” occurs 42 times.[6]
Many of the report’s findings appear obvious in hindsight but would have appeared outlandish and extreme in foresight. For example:
- “Perhaps the single most important question the Select Committee has struggled to answer is why the federal response did not adequately anticipate the consequences of Katrina striking New Orleans.”
- “It remains difficult to understand how government could respond so ineffectively to a disaster that was anticipated for years, and for which specific dire warnings had been issued for days. This crisis was not only predictable, it was predicted.”
In fact, Tom Davis, the Chairman of the committee, stated in a hearing: “That’s probably the most painful thing about Katrina, and the tragic loss of life: the foreseeability of it all.”
The American public agreed. In an October 2007 Gallup poll, 62% of respondents thought the United States government—and especially the Army Corps of Engineers—should have done more to prevent the damage caused by Hurricane Katrina.
However, two presenters at a conference held at Louisiana State University covering the ten-year anniversary of Katrina offered a reminder that what appears obvious in hindsight may not have been so obvious in foresight:
Given the evidence, it would be hard to argue that the authorities did not take Katrina seriously, or that they failed to prepare and warn the local populations. These preparatory efforts included nearly all the activities one would expect. There was a well-executed evacuation for car owners; FEMA had pre-staged resources; the state of Louisiana and the federal Coast Guard had boats ready; and shelters were organized.
But, since Katrina-like disasters are so rare, they are impossible to predict with any kind of accuracy. The presenters continue:
“We know that the preparatory actions were not sufficient in light of the immense scale of destruction caused by Katrina, which we have qualified as a Black Swan event. Research tells us that Black Swans will happen and cause surprise, even if you stare them in the face. The question, then, is: what can we expect when something truly unexpected happens?”[7]
Hindsight bias in medicine
Hindsight bias is also a problem in medicine, where doctors often must make a diagnosis with partial information and may later be judged on outcomes they could not have foreseen.
In one case, a radiologist missed a tiny, developing tumor in an x-ray of a 66-year-old man. Then, three years later, when the man returned after experiencing a chronic cough, chest pain, and weight loss, the tumor was finally discovered. He underwent treatment but died sixteen months later. The family sued the radiologist for malpractice.
When the case went to trial, the defense attorney asked radiology experts who were unaware of the tumor to review the original x-ray; none of them spotted it. But once they saw the large tumor in the images taken three years later, they were immediately able to identify the small tumor in the original image.
But the plaintiff’s radiology experts weren’t convinced. They reviewed the original x-ray and claimed the tumor should have been caught. It’s the job of a radiologist to spot tumors, and this radiologist missed it.
Should the first radiologist have seen the tiny tumor? This was the central question of the case, and it was ultimately decided by what the jury believed the defendant should have known in hindsight.
In the end, the jury—which also knew the outcome—found the radiologist liable by a 10-to-2 vote.[8]
In another instance, an elderly man was given antibiotics after surgery, even though he was allergic to the medication, and even though this allergy was noted on his chart. Should the doctor be sued for malpractice and the doctor’s license suspended?
In instances where the patient died, 55% of physicians believed the doctor should be sued. But in instances where the patient survived, only 4% did.
But when the same question was posed to the public, 50% thought the doctor’s license should be suspended when the patient died, but only 23% when the patient lived.
In both cases, the doctor’s actions, competence, and intent were the same; the doctor was equally negligent, and the doctor’s actions contributed no more to one outcome than to the other. Yet those same actions were judged far more harshly when the outcome was negative.[9]
Hindsight bias in business
Hindsight bias can distort an entrepreneur’s recollection of their odds of success at the beginning of a venture. In one study, researchers surveyed 705 entrepreneurs from failed startups. Before the failure, 77.3% of the entrepreneurs believed their startup would grow into a successful business with positive cash flow. After the startup failed, only 58% recalled having believed their startup would succeed. Next time you read a founder’s story or hear a founder interviewed on a podcast, keep in mind that recollections like these can shift by roughly 20 percentage points.[10]
It’s also important to be aware of hindsight bias when determining which events and actions led to a business’s success or failure. For example, in May 2000, Fortune published a glowing account of Cisco, which stated:
Cisco, with CEO John Chambers at the helm, must be considered one of America’s truly outstanding companies, in the same league as Intel, Wal-Mart, and yes, GE.[11]
The article went into great detail about the steps Cisco had taken to become the premier company in Silicon Valley.
But a year later, after Cisco’s stock price had fallen from $80 to $14 and thousands of employees had been laid off, Fortune ran a very different story:
On the way up to a stock market value of half a trillion dollars, everything about Cisco seemed perfect. It had a perfect CEO. It could close its book in a day and make perfect financial forecasts. It was an acquisition machine, ingesting companies and their technologies with great aplomb. It was the leader of the new economy, selling gear to new-world telecom companies that would use it to supplant old-world carriers and make their old-world suppliers irrelevant. Over the past year, every one of those characterizations has proved to be false.
Fortune went on to say that “dozens of conversations with customers, past and present Cisco executives, competitors, and suppliers” had confirmed that “Cisco made its own mess.”
What happened? Didn’t Fortune realize they had published articles stating the opposite? Fortune’s editor-at-large stated, “I think there is a pendulum effect, and that we all may get too caught up and perhaps over accentuate what’s going on.”
Peter Burrows, who wrote similar accounts of Cisco in Business Week, stated in a conversation which appears in Phil Rosenzweig’s The Halo Effect, that “Cisco had never really been a cost-conscious company.”
Yet this isn’t what was reported prior to Cisco’s fall. It’s difficult to go back and reconstruct what we may have thought when the outcome looked different. This is hindsight bias at work.
How hindsight bias was discovered
Hindsight bias was first studied experimentally during the mid-1970s as part of a broader research program on heuristics and biases conducted by Israeli psychologists Amos Tversky and Daniel Kahneman. One of Kahneman’s doctoral students, Baruch Fischhoff, was the first to document the existence of hindsight bias and measure its effects.
In his study, Fischhoff asked five groups of people to predict the outcome of the following event, which took place during the 1814 conflict between British and Nepalese soldiers:
(1) For some years after the arrival of Hastings as governor-general of India, the consolidation of British power involved serious war. (2) The first of these wars took place on the northern frontier of Bengal where the British were faced by the plundering raids of the Gurkas of Nepal. (3) Attempts had been made to stop the raids by an exchange of lands, but the Gurkas would not give up their claims to the country under British control, (4) and Hastings decided to deal with them once and for all. (5) The campaign began in November 1814. It was not glorious. (6) The Gurkas were only some 12,000 strong; (7) but they were brave fighters, fighting in territory well-suited to their raiding tactics. (8) The older British commanders were used to war in the plains where the enemy ran away from a resolute attack. (9) In the mountains of Nepal, it was not easy even to find the enemy. (10) The troops and transport animals suffered from the extremes of heat and cold, (11) and the officers learned caution only after sharp reverses. (12) Major-General Sir D Octerlony was the one commander to escape from these minor defeats.
Fischhoff then asked each of the five groups to predict the probability of four outcomes:
- British victory
- Gurka victory
- Stalemate with no peace
- Stalemate with peace
Four of the five groups were each told in advance that one of these outcomes had occurred; the fifth group was not told an outcome.
By comparing the predictions of the groups that were given an outcome with those of the group that wasn’t, he was able to measure how knowing an outcome affected the predictions. He found that knowing the outcome made people, on average, 10.8% more likely to view that outcome as inevitable.
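To make the measurement concrete, here’s a minimal sketch of Fischhoff’s between-groups comparison in Python. The group means below are invented for illustration; they are not the study’s data.

```python
# Minimal sketch of Fischhoff's between-groups design. All numbers are
# invented for illustration; they are not the study's data.

OUTCOME = "British victory"

foresight_mean = 0.34   # mean probability assigned by the group told no outcome
hindsight_mean = 0.45   # mean probability assigned by the group told this outcome occurred

# The hindsight effect is simply the difference between the two group means
# for the same outcome.
shift = hindsight_mean - foresight_mean
print(f"Hindsight shift for '{OUTCOME}': {shift:+.1%}")
# Prints roughly: Hindsight shift for 'British victory': +11.0%
# (Fischhoff's reported average across conditions was about 10.8%.)
```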
When he asked people in each group why they made their predictions, he found that data supporting the given outcome was deemed more relevant to the decision. This is confirmation bias at work: people are more likely to notice and remember data that supports the outcome they have been given, and more likely to discard, forget, or dismiss as irrelevant the data that doesn’t.
In a different experiment, Fischhoff conducted interviews with people right before and right after President Nixon’s trips to China and the Soviet Union. He asked subjects to predict the likelihood of various events, such as Nixon meeting Mao Zedong, seeing Lenin’s tomb, or announcing the trip a success.
After Nixon’s return, Fischhoff followed up. He asked subjects what their predictions had been and whether the events they had predicted had happened. He found that subjects recalled giving high probabilities to events that happened and low probabilities to events that didn’t. For example, someone who had predicted the trip would be a failure was less likely to remember that prediction once the trip turned out a success. Now that they knew Nixon’s trip had succeeded, they couldn’t conceive of having predicted any other outcome.[12]
Measuring hindsight bias
In another study, psychologists asked participants to watch a video of an event as it unfolded, such as a car accident. At various stages, they paused the video and asked participants to predict the likelihood of various outcomes, such as no accident, minor accident, or serious accident.
Later, the subjects were once again asked to watch the same video but were told to disregard their outcome knowledge. Instead, they were to predict the likelihood of the various outcomes as if they didn’t know what really happened—to evaluate the scene as objectively as possible from the perspective of someone who hadn’t seen it.
In hindsight, people reported that they would have predicted the outcome at an earlier point in the video than they actually had. In this way, hindsight bias isn’t simply “knowing it all along.” It’s believing you knew it earlier than you did.[13]
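Here’s a rough sketch of how that time-based measure can be computed, again with invented numbers rather than the study’s data. Each value is the second of the video at which a viewer first rated the eventual outcome as most likely.

```python
# Hypothetical illustration of the time-based measure of hindsight bias.
# Invented numbers, not the study's data. Each value is the second of the
# video at which a viewer first rated "serious accident" as most likely.

foresight_calls = [18.0, 21.5, 19.0, 24.0, 22.5]  # predicting in real time
hindsight_calls = [12.0, 14.5, 11.0, 16.0, 13.5]  # re-watching, knowing the outcome

def mean(xs):
    return sum(xs) / len(xs)

# The bias shows up as hindsight viewers "calling it" earlier in the video.
bias_seconds = mean(foresight_calls) - mean(hindsight_calls)
print(f"In hindsight, viewers 'saw it coming' {bias_seconds:.1f}s earlier on average")
```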
4 dangers of hindsight bias
Most of the negative effects of hindsight bias can be grouped into four categories. Let’s take a look at each of these:
1. Hindsight bias prevents us from learning from experience
It’s common knowledge that you ought to learn from your mistakes. To do this, you need to evaluate how actions you took in the past contributed to failure. Conversely, avoiding future failure means not repeating past mistakes.
But the truth is, the advice to learn from your mistakes isn’t as practical as it sounds. It assumes we can correctly infer causes from effects and accurately determine why an outcome occurred. The problem is that identifying causal relationships between events and outcomes isn’t as easy as it seems (we’ll see why in a moment).
If hindsight bias blinds us to the correct causes of bad outcomes, then we’re more likely to repeat the actions which led to those bad outcomes. Thus, hindsight bias makes it difficult to learn from experience.
2. Hindsight bias causes us to wrongly assign blame
Negative outcomes require an explanation more than neutral or positive outcomes. (We’ll see why in a moment.)
This makes sense if you think about it.
Let’s say sales were down last quarter, or the project you’re working on failed. In that case, you must explain why. Why were sales down? Why did the project fail?
But now let’s say sales are exactly what you predicted, or the project you’re working on is going better than you could have expected. In those cases, you don’t need to explain yourself.
Average and expected events don’t require any kind of special explanation.
This is especially important for managers to keep in mind. If you manage people, then you must explain not only your behavior but also the behavior of your direct reports.
While employees make decisions without knowing the complete outcome of their decisions, managers evaluate those decisions only after learning the outcome.
An employee with an excellent performance history could make the best possible decision based on all available information and company protocol, but a negative outcome could still occur because of circumstances beyond the employee’s control.
Because we are uncomfortable with extreme outcomes—negative or positive—we go to great lengths to explain how they happened. And sometimes this means we assign blame where none exists.[14]
3. Hindsight bias causes us to judge others too harshly
When we know the outcome of an event, we are also more likely to blame victims for their actions than if we don’t know the outcome.
This is a critical problem in cases of blaming women who have been raped. In one study, people read an account of an interaction between a man and a woman. The accounts had two different endings. One ended with, “The next thing I knew, he raped me.” The other read, “The next thing I knew, he took me home.”
Those who read the story that ended in rape were 8.4% more likely to say the woman’s actions contributed to the rape. They were less likely to say the woman’s actions contributed to the man taking her home.
In other words, when the woman was raped, observers drew a causal link between her actions—the things she said, the clothes she wore—and the rape. But when she was not raped, they inferred no causality between her actions and the outcome.[15]
Hindsight bias not only changes our perception of how events contribute to outcomes; it also makes us more likely to assign blame or behave cruelly toward victims. This is unjust, and it’s a tendency we need to intentionally counteract.
4. Hindsight bias makes us overconfident
Hindsight bias also creates a gap between our actual performance and our perceived performance. One study found that investment bankers who regularly exhibited hindsight bias earned far less than peers who didn’t, and that more experienced bankers were just as biased as less experienced ones.[16]
If you believe in hindsight that you “knew it all along,” then you have little motivation to find new ways to solve old problems. This hurts your ability to infer causality in the future.
If you’re an expert in your field or you have years of experience, you are especially susceptible to hindsight bias. (There is a specific kind of expertise that’s an exception to this rule, as we’ll see in a moment.)
3 kinds of hindsight bias
Not all hindsight bias is alike. You view events in hindsight in 3 primary ways:
1. You mis-remember
In some cases, you simply mis-remember what you previously thought. Before an event, you may have said it wouldn’t happen. After the event, you recall having said it would.
This isn’t the same thing as forgetting.
It’s that you’re unable to conceive of a reality different from what you now know.
To see what I mean, consider a classic audio demonstration.
Listeners are first played a garbled recording. Most people can’t describe what it is, much less guess what’s being said.
Then they’re played a clear recording of the same sentence.
Once you have heard the clear recording, it’s impossible to go back to the first recording and hear only garbled sounds.
Hindsight bias works in the same way. Once you know the outcome, you can’t conceive of a reality that could have produced a different outcome. You can’t un-hear the clear recording.
2. You view events as inevitable
In other cases, you view an outcome from a position of inevitability. You’ve probably heard phrases such as:
- “Given the state of the economy in 2012, Obama was going to win.”
- “Given the angst among white working class males in 2016, Trump was going to win.”
Events were destined to happen in a certain way.
3. You view events as foreseeable
This is perhaps the most common form of hindsight bias. It could be called the I-knew-it-all-along effect. Here are a few examples:
- “I knew the stock would go up when I bought it.”
- “I sold the stock because I knew it would go down.”
- “I knew Trump would win in 2016.”
In fact, you probably didn’t know then what you know now with the same degree of confidence.
Why you fall for hindsight bias
Hindsight bias is pervasive and obviously problematic. So why do you keep falling for it?
1. Memory is not designed for accuracy
Memory feels linear and ordered. It’s as if you take in everything as you experience it and file it away for reference later.
When later comes—when you need to recall what happened—you go back, open the file, and pull out what’s there. You evaluate a pile of memories and work forwards to come up with an outcome.
But this isn’t how memory works, even though it feels like it.
Instead, here’s how memory works.
Your perception of the world is fragmented. You remember experiences—what you saw, how you felt—in small pieces. You only take in the very minimum necessary, and you retain even less.
This means you face two problems when you go back in mental time:
- Because you take in only a limited portion of what you experienced, there’s not much there to begin with.
- Your retrieval process is lossy: you can recall only a fraction of what you stored.
As you can see, our brains are not designed to remember the past with accuracy.
This is so counter-intuitive—so different from how you experience remembering—that neuroscientists have figured this out only recently.
One study asked people to talk about and think about specific events from their past. After they had done so, researchers asked them to talk about and think about made-up, imaginary events in the future. They noticed activity in the same parts of the brain for both tasks—whether the events had actually happened or not.[17]
From your brain’s perspective, remembering the past and imagining the future are the same thing. Let that sink in for a moment.
If I ask you what you had for breakfast, your brain goes through a set of steps to produce an answer.
If I later ask you how you plan to ride your orange unicorn home from work when its only horn is in the shop, your brain activity looks exactly the same.
In other words, your brain goes through the same steps in order to explain both why actual events happened in the past and why impossible events might happen in the future.
Memory involves constructing a reality from incomplete fragments of information, and it doesn’t matter if that information comes from real, actual, past events, or hypothetical, made-up future events. As two leading researchers of this phenomenon have put it:
This flexibility of episodic memory makes it well suited to supporting simulations of different ways that future experiences might unfold, but according to the constructive episodic simulation hypothesis, this very flexibility can also result in memory errors from mis-combining elements of past experiences or confusing imagined and actual events.[18]
Because your brain can barely distinguish the real from the imagined, if you prime your brain with an outcome, then your memory is more likely to sample data that conforms with the outcome than data that doesn’t.
Part of the reason you’re so susceptible to hindsight bias is that your brain is evaluating two things:
- A real, present, known outcome.
- A pile of memories it barely encoded, mostly forgot, and can hardly reconstruct.
When the brain is faced with these two inputs, it’s extremely difficult to evaluate the past when the present is already known. It’s no wonder you’re so susceptible to hindsight bias.
But it gets worse.
Next, we’ll explore how breakdowns happen in the memory-retrieval process.
2. You confuse “easy” with “true”
When it’s easy for you to retrieve information from memory, then you are more likely to believe it is true.
Thus, if it’s easy for you to connect events with outcomes, you are more likely to believe a connection really exists—compared to when it’s difficult to connect events with outcomes. (Remember that in reality, there may not be a real connection.)
In one study, people were asked to produce either two or ten explanations for an event. People who produced two explanations showed more hindsight bias than people who produced ten explanations.[19]
This makes sense if you think about it.
Imagine a scenario where the roof of a car is smashed in, a tree has fallen next to the car, and you remember it being windy last night. If I ask you to produce two explanations, then one of them will probably be that the wind blew the tree over and it fell onto the car. But if I ask you to produce lots of additional explanations, each one will be more difficult to come up with than the last. By the tenth, twentieth, or thirtieth explanation, your car is being smashed by rogue construction equipment or a failed Falcon Heavy launch.
Each successive explanation is less believable than the first. Not because it’s more true or less true, but because it’s not as easy to come up with. From your brain’s perspective, easy equals true.
When you are biased by hindsight, your brain searches for the easiest explanation for an outcome, regardless of whether that explanation is true. The easiest connection becomes the best explanation.
3. You want to be in control of your actions and outcomes
Another reason we are biased by hindsight is that we desire to be at the center of whatever causal relationship we observe between an event and an outcome. This is known as the illusion of control.
This isn’t a bad thing. There are good reasons to view the world as ordered and predictable and to view yourself at the center of it. The fact that you can exercise influence, that your thoughts and words and actions matter, isn’t just an illusion, it’s mostly true. It’s what allows us to be well-ordered and functioning human beings.
But it also explains why we are not comfortable with events we can’t explain or with outcomes produced by random chance. When we encounter outcomes beyond our control (whether we realize it or not), we trick ourselves into thinking we can control the outcome.
A series of studies presented people with outcomes that were produced by random chance.
In the first study, people were asked to pick their own lottery numbers; then they were asked to let others choose numbers for them. Even though the odds of winning were the same either way, people strongly resisted having others pick their numbers.[20]
In another, more elaborate study, people were shown a series of coin tosses and asked to predict whether each toss would come up heads or tails. The outcome was announced by one of the researchers, and the subjects never saw how the coin landed. This let the researchers manipulate the feedback so that correct guesses clustered at the beginning of the series.
Because people correctly guessed the outcome of the coin toss at a better-than-chance rate, they came to believe they were somehow responsible for their correct guesses, that they had developed a skill for guessing correctly. But as the series of coin tosses progressed, they became overconfident and began performing worse than they expected. The researchers found that when people experience positive outcomes from events governed by random chance, they believe they have acquired a skill or figured out how to cause that outcome. (Oddly, when a negative outcome occurs, they blame bad luck.)[21]
Hindsight bias cloaks random, chance events with a sense of order, predictability, and an illusion of control. It offers meaning when outcomes are meaningless. And it helps you go back in time, construct a narrative of what happened, and place yourself within the narrative as an agent bringing about the outcome.
Or, as Stanford professors James March and Zur Shapira have written:
Post hoc reconstruction permits history to be told in such a way that ‘chance,’ either in the sense of genuinely probabilistic phenomena or in the sense of unexplained variations, is minimized as an explanation.[22]
We are not comfortable with randomness.
4. You don’t want to be wrong
Perhaps the most significant reason you’re biased by hindsight is that, at the deepest level, you don’t want to be wrong. You don’t want to find inconsistency between your past self and your present self.
It makes sense why this is so. If you’re found out to be wrong, then you are responsible.
On the other hand, you can’t be blamed for something you didn’t see coming. You’re not responsible for the outcome. And if you’re not responsible for the outcome, then your reputation and credibility as a person remain intact.
When people are in control of a situation that results in an outcome they didn’t predict—such as a relationship breakup or losing a job—they make claims like “I could never have seen it coming.” But when people are not in control of a situation that results in an outcome they didn’t predict—such as electoral outcomes or winning a game—then they make claims like “I just knew it would happen.”[23]
We’re back to the I-knew-it-all-along effect we saw earlier.
Hindsight bias resolves the inconsistency between your past self and your present self. You want what you think now to be consistent with what you thought then. It’s easier for your mind to change the past than for the past to change your mind.
You find causality where there isn’t any
At the root of hindsight bias is your inability to identify causality. To start, let’s look at the three kinds of causality.
3 kinds of causality
- One-to-one causal relationships: For some outcomes, one cause produces one effect, and one effect has only one cause. For example, two plus two equals four, and four equals two plus two.
- One-to-many causal relationships: For other outcomes, many causes produce one effect. For example, a failed harvest may have many causes: insects, lack of rain, and so on.
- Many-to-many causal relationships: For these outcomes, many causes can produce many complex effects. For example, the 2008 housing crisis had many contributing causes that led to many outcomes; it would be impossible to identify all the causes and connect them to their corresponding effects. (This is why historians are still debating why the Great Depression started and how it ended.)
As we move from one-to-one causal relationships to many-to-many causal relationships, we run into trouble. Our brains can compute many-to-many causal relationships when thinking about events that may happen in the future, but our brains cannot do so when thinking about events that happened in the past. This is why we construct cause-and-effect relationships where none exist.
To illustrate, let’s look at the example of the 1979 crash of Western Airlines flight 2605.[24]
There were five events that contributed to the crash:
- Fifteen minutes before the crash, one of the pilots said “I think I got about 3 hours sleep this afternoon,” and “I think I’m gonna sleep late all night.” He had, in fact, slept only 4 of the previous 24 hours, and his co-pilot had slept only 5. They were tired.
- Eleven minutes before the crash, the pilots reported low visibility. They said things like “Yeah, smoke over the city,” and “Look at that smog.”
- Two minutes before the crash, the pilots lost contact with the airport, which caused confusion.
- The airport had two runways. The control tower told the pilots to land on the right runway—which was illuminated—while following the radar beam of the left runway—which was not illuminated. Right before landing, they were to turn to the right so they could land on the correct runway. One minute and five seconds before the crash, the control tower reported to the pilots that they were too far to the left. The pilots responded but didn’t realize they were to the left of the left runway.
- Forty-three seconds before the crash, the control tower gave the pilots incorrect information about which runway was open: “Ok, sir, ok? Approach lights on runway 23 left but that runway is closed to traffic.” But the reverse was true: the right runway had lights on, but the left runway had the beam.
Thirteen seconds before landing, the pilots realized their error, but it was too late to climb out. The plane landed on the wrong runway and crashed into a truck. The pilots tried and failed to get airborne again. The crash killed 72 of the 88 people on board.
The crash can be explained in hindsight by one or all of these causes: fatigue, miscommunication, poor visibility, construction equipment on the runway. And it would be easy to construct a narrative in hindsight that predicts a crash based on these factors.
But it would be impossible to construct this narrative in foresight.
That’s because each of these factors occurs all the time and almost never leads to a crash. Pilots often fly tired. Miscommunication happens. Planes fly in low visibility (and are designed to do so). Airports are often under construction. Thousands of flights every day encounter some or all of these conditions, yet planes rarely crash.
Just as the antecedent events cannot predict airplane crashes that may happen in the future, they also cannot explain airplane crashes that have happened in the past.
How causality works
Let’s unpack causality in a bit more detail.
1. There are as many causes as there are explanations
Several decades ago, philosopher Norwood Russell Hanson observed:
Consider how the cause of death might have been set out by the physician as ‘multiple haemorrhage’, by the barrister as ‘negligence on the part of the driver’, by the carriage-builder as ‘a defect in the brakelock construction’, by a civic planner as ‘the presence of tall shrubbery at that turning’. None is more true than any of the others, but the particular context of the question makes some explanations more relevant than others.[25]
2. We look for causes of exceptional events, not average events
If the underdog wins, we need to find a reason.
But if we wake up in the morning, make coffee, and have breakfast, we don’t search for reasons why we did this.
That’s because average events don’t require causes. They just seem to happen.
We don’t look for causes of “normal”; we look for causes of deviations from “normal.”
3. We believe causal relationships are more likely to be true than non-causal relationships
When two things stand in a cause-and-effect relationship, people are more likely to trust the claim. One study showed groups of people two statements:
- Scenario 1: a girl has blue eyes if her mother has blue eyes.
- Scenario 2: a mother has blue eyes if her daughter has blue eyes.
People viewed the first statement as more believable, even though the two are equally probable.[26]
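A quick Bayes check shows why the two directions are equally probable, assuming (as the comparison implies) that blue eyes are equally common among mothers and daughters. The prevalence and heritability figures below are made up for illustration.

```python
# Why P(mother blue | daughter blue) equals P(daughter blue | mother blue)
# when prevalence is the same across generations. Figures are made up.

p_blue = 0.25                    # assumed prevalence in both generations
p_daughter_given_mother = 0.60   # illustrative "causal" direction

# Bayes' rule: P(M | D) = P(D | M) * P(M) / P(D). With P(M) == P(D),
# the ratio cancels, so the "diagnostic" direction is equally probable.
p_mother_given_daughter = p_daughter_given_mother * p_blue / p_blue
print(p_mother_given_daughter)   # 0.6, identical to the causal direction
```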
Putting it all together: hindsight bias and the stories we tell ourselves
Let’s review where we’re at:
- We have explored the pervasive nature of hindsight bias: in the courtroom, politics, government, medicine, and business.
- We have seen how Baruch Fischhoff experimentally demonstrated the existence of the bias, and how the bias has been tested over the years.
- We looked at four of the dangers of hindsight bias.
- We looked at the three kinds of hindsight bias.
- We explored four reasons we fall for hindsight bias.
- Last, we looked at the complex relationship between multiple causes and an effect, and why our neurobiology is ill-equipped to connect past events with present outcomes.
Now let’s put it all together.
The structure that binds these factors together to produce hindsight bias is narrative.
Here’s how it works:
- People try to connect antecedent events and outcomes to create a causal relationship.
- This, combined with our discomfort with randomness, leads to a sense of inevitability where none exists.
- If the outcome can be explained by one (of many) antecedent events, then the brain confirms a causal relationship.
- Our brain’s desire for speed and efficiency often wins over our brain’s desire for accuracy.
- As a result, we sample only the evidence most relevant to the outcome. Usually, that’s either the easiest and most accessible cause or the first cause we think of.
- Hindsight bias simultaneously prevents us from looking for and evaluating evidence that conflicts with our outcome.
The result?
A story. The kind of story with a problem and a solution, with tension and resolution, and with all the loose ends tied up.
And not just any story. A great story.
You’re more likely to think a good story is true than a bad story. And you’re more likely to believe someone if they’re a great storyteller than if they’re a poor one.
Great stories feel true, even when we make them up.[27]
And some of the greatest stories are the ones we make up about ourselves.
Joan Didion is well known for the line “We tell ourselves stories in order to live,” which opens the title essay of The White Album. But there’s more in the paragraph beyond the first line:
“We tell ourselves stories in order to live…We look for the sermon in the suicide, for the social or moral lesson in the murder of five. We interpret what we see, select the most workable of the multiple choices. We live entirely, especially if we are writers, by the imposition of a narrative line upon disparate images, by the ‘ideas’ with which we have learned to freeze the shifting phantasmagoria which is our actual experience.”[28]
At the root of hindsight bias is our desire to map our experiences—past, present, or future; real or imagined—onto a narrative structure. This desire overrides our aims for accuracy or truth. And it’s a desire that helps us find meaning, agency, and purpose in the world, without regard for the contrived way we go about getting it.
(The next line in Joan Didion’s essay is “At least we do for a while.”)
How to avoid hindsight bias
The good news is that there are some things we can do to avoid falling for hindsight bias in some cases.
1. Find counterfactuals (but not too many)
Finding counterfactuals helps us conceive of different causes for the same outcome.
However, each new counterfactual is more difficult to create than the last; coming up with ten counterfactuals is much harder than coming up with two. And we are less likely to find that tenth counterfactual believable: we compare it to our original, hindsight-informed assumptions and find the counterfactual inconceivable. Hindsight bias wins again.
A good hedge against hindsight bias is to identify only a few counterfactuals, but not too many. Aim for two; if you’ve reached ten, you’ve gone too far.
2. Become an expert (in some cases)
If you have more experience, it seems you should be less susceptible to hindsight bias.
One study that seems to support expertise as a hedge against hindsight bias had participants swing real bats at simulated balls projected onto a screen. Participants predicted how well they had hit each ball: how far, in which direction, and so on. The batters were then shown where the ball actually went. The more experienced the participants, the smaller their hindsight bias.[29]
On the other hand, experience gives you a larger pool of possible causes to draw from, which can make it easier to construct a plausible but wrong explanation. We saw earlier that more experienced investment bankers were just as biased as less experienced ones (and didn’t earn more).[30] We also saw that expert radiologists couldn’t un-see a tiny tumor in an x-ray after learning it was there.[31] Expertise doesn’t always help you avoid hindsight bias.
So which is it? More experience, or less?
It seems that domains that provide immediate feedback, such as chess or weather forecasting (or hitting a baseball), help people develop correct causal connections between events and outcomes. This helps them make better predictions going forward.
But in domains where feedback is delayed, expertise does not protect against hindsight bias. Think of judges and talk show pundits (or bankers and radiologists), whose feedback may be delayed by weeks or months—or never arrive at all.
3. Find the past in its original form
Depending on the situation, you may be able to find the past in its original form through primary source documents: newspaper articles, meeting minutes, or emails dated prior to the outcome. Referring to the past as it existed then is usually more accurate than relying on memory.
It’s not bias-proof, though. No matter how many primary sources you find, you can still piece them together in incorrect ways. (This is why history continues to be written and rewritten.)
4. Find people who haven’t experienced the event
This doesn’t work in every case. It wouldn’t be possible to view the events of World War II, or Nixon’s visit to China, or 9/11 from the perspective of foresight. There are too many people too familiar with these events, and, in many cases, an explanatory historical narrative has already been written.
But in other cases, it’s easy to find people who haven’t experienced an event. (In the case of the radiologist who missed a tiny tumor on an x-ray, the defendant’s lawyers employed this approach.)
Anonymize the events, remove the outcome, present the material to people unfamiliar with the situation, and see what outcomes they predict.
***
It’s not possible to overcome hindsight bias completely. Perhaps the main lesson we ought to learn is to approach claims about the past—either our own, or those of others—with skepticism, or, better yet, humility.
[1] Keegan, Victor. “Will MySpace ever lose its monopoly?” The Guardian, February 8, 2007.
[2] Starr, V., McCormick, M. (2009). Jury Selection. Aspen Publishers.
[3] Hastie, R., Schkade, D. A., & Payne, J. W. (1999). “Juror judgments in civil cases: Hindsight effects on judgments of liability for punitive damages.” Law and Human Behavior, 23(5): 597–614.
[4] Casper, J. D., Benedict, K., & Kelly, J. R. (1988). “Cognitions, attitudes and decision–making in search and seizure cases.” Journal of Applied Social Psychology, 18(2): 93–113.
[5] Dietrich, D. and Olson, M. (1993). “A demonstration of hindsight bias using the Thomas confirmation vote.” Psychological Reports 72: 377–378.
[6] “A Failure of Initiative: Final Report of the Select Bipartisan Committee to Investigate the Preparation for and Response to Hurricane Katrina.” (2006). Washington, DC.
[7] Boin, A. and Richardson, J. (2015), “Crisis management in Katrina’s immediate aftermath: Lessons for a national response to a super disaster.” Unpublished paper presented at the Katrina@10 Conference.
[8] Berlin, L. (2000). “Hindsight bias.” American Journal of Roentgenology, 175(3): 597–601.
[9] Blendon, R. J., DesRoches, C. M., Brodie, M., Benson, J. M., Rosen, A. B., Schneider, E., Altman, D. E., Zapert, K., Herrmann, M., & Steffenson, A. E. (2002). “Views of practicing physicians and the public on medical errors.” New England Journal of Medicine, 347: 1933–1940.
[10] Cassar, G., & Craig, J. B. (2009). “An investigation of hindsight bias in nascent venture activity.” Journal of Business Venturing, 24(2): 99–196.
[11] Serwer, A. (2000). “There’s something about Cisco.” Fortune. This account is taken from Rosenzweig, P. (2007), The Halo Effect… and the Eight Other Business Delusions That Deceive Managers, The Free Press.
[12] Fischhoff, B. (1975). “Hindsight ≠ foresight: The effect of outcome knowledge on judgment under uncertainty.” Journal of Experimental Psychology: Human Perception and Performance, 1(3): 288–299.
[13] Fessel, F., Epstude, K., Roese, N. (2009). “Hindsight bias redefined: It’s about time.” Organizational Behavior and Human Decision Processes, 110(1): 56–64.
[14] Schkade, D. (1991). “Expectation–outcome consistency and hindsight bias.” Organizational Behavior and Human Decision Processes 49: 105–123.
[15] Janoff-Bulman, R., Timko, C., Carli, L. (1985). “Cognitive biases in blaming the victim.” Journal of Experimental Social Psychology 21(2): 161–177.
[16] Biais, B., & Weber, M. (2009). “Hindsight bias, risk perception, and investment performance.” Management Science, 55(6): 1018–1029.
[17] Szpunar, K. K., Watson, J. M. & McDermott, K. B. (2007). “Neural substrates of envisioning the future.” Proceedings of the National Academy of Sciences USA, 104: 642–647.
[18] Schacter, D. and Madore, K. (2017). “Remembering the past and imagining the future: Identifying and enhancing the contribution of episodic memory.” Memory Studies 9(3): 245–255. There’s also a great summary in Schacter, D., Addis, D., and Buckner, R. (2007). “Remembering the past to imagine the future: the prospective brain.” Nature Reviews, 8: 657–661.
[19] Sanna, L., Schwarz, N., & Stocker, S. (2002). “When debiasing backfires: accessible content and accessibility experiences in debiasing hindsight.” Journal of Experimental Psychology: Learning, Memory, and Cognition, 28(3): 497–502.
[20] Langer, E. (1975). “The illusion of control.” Journal of Personality and Social Psychology, 32(2): 311–328.
[21] Langer, E. and Roth, J. (1975), “Heads I Win, Tails It’s Chance: The Illusion of Control as a Function of the Sequence of Outcomes in a Purely Chance Task.” Journal of Personality and Social Psychology, 32(6): 951–955.
[22] March, J. and Shapira, Z. (1987), “Managerial perspectives on risk and risk-taking.” Management Science, 33(11), 1404–1418.
[23] Roese, N. and Vohs, K. (2012). “Hindsight bias.” Perspectives on Psychological Science, 7(5): 411–426.
[24] There’s a more detailed account of this crash in Dawes, R. M. (1993). “Prediction of the future versus an understanding of the past: A basic asymmetry.” American Journal of Psychology, 106(1): 1–24.
[25] Hanson, N. R. (1958). Patterns of discovery: An inquiry into the conceptual foundations of science. Cambridge, England: Cambridge University Press.
[26] Tversky, A., & Kahneman, D. (1980). “Causal schemas in judgments under uncertainty.” In M. Fishbein (Ed.), Progress in Social Psychology (pp. 49–72). Erlbaum.
[27] Blank, H., & Nestler, S. (2007). “Cognitive process models of hindsight bias.” Social Cognition, 25: 132–146.
[28] Didion, J. (1979). The White Album. Simon & Schuster.
[29] Gray, R., Beilock, S. L., & Carr, T. H. (2007). “As soon as the bat met the ball, I knew it was gone”: Outcome prediction, hindsight bias, and the representation and control of action in novice and expert baseball players. Psychonomic Bulletin & Review, 14: 669–675.
[30] Biais, B., & Weber, M. (2009). “Hindsight bias, risk perception, and investment performance.” Management Science, 55(6): 1018–1029.
[31] Berlin, L. (2000). “Hindsight bias.” American Journal of Roentgenology, 175(3): 597–601.