“Hindsight bias” refers to the tendency for uncertain outcomes to seem more likely once we know which outcome actually occurred. Because of it, we are prone to view what has already happened as relatively inevitable and obvious, without realising how knowledge of the outcome has affected our judgment.
One of the first psychologists to investigate hindsight bias was Baruch Fischhoff who, together with Ruth Beyth, used President Richard Nixon's historically important 1972 diplomatic visits to China and the Soviet Union as the focus for a study. Before the visits took place, participants were asked to assign probabilities to 15 possible outcomes, such as whether the U.S. would establish a diplomatic mission in Peking or a joint space programme with the Soviet Union. Between two weeks and six months after the visits, the same people were asked to recall what their earlier predictions had been. The results were clear: the majority of participants inflated their estimates for the outcomes that had occurred, while remembering having assigned lower probabilities to those that had not. The bias also grew stronger as the interval between the initial prediction and the recall task increased. Many other events that captured public attention have since been studied, with similar results.
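As a minimal sketch of how a shift like this can be quantified, the short Python example below compares each original probability estimate with the estimate later recalled, split by whether the outcome occurred. The function and the numbers are purely hypothetical illustrations, not Fischhoff and Beyth's actual data or analysis.

    def hindsight_shift(original, recalled, occurred):
        # Mean recalled-minus-original shift in probability estimates,
        # split by whether the outcome actually occurred.
        shifts = {True: [], False: []}
        for first, later, happened in zip(original, recalled, occurred):
            shifts[happened].append(later - first)

        def mean(xs):
            return sum(xs) / len(xs) if xs else 0.0

        return mean(shifts[True]), mean(shifts[False])

    # Hypothetical data for one participant and five possible outcomes.
    original = [0.30, 0.60, 0.20, 0.50, 0.70]    # estimates made before the visits
    recalled = [0.45, 0.50, 0.35, 0.40, 0.80]    # estimates "recalled" afterwards
    occurred = [True, False, True, False, True]  # what actually happened

    up, down = hindsight_shift(original, recalled, occurred)
    print(f"Occurred outcomes drift by {up:+.2f}; non-occurred by {down:+.2f}")

A positive drift for the outcomes that occurred, alongside a negative drift for the rest, is exactly the signature Fischhoff and Beyth reported.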
The heart of the problem seems to be that once we adopt a new understanding of the world, we immediately find it difficult to reconstruct our past beliefs with any accuracy. This inevitably causes us to underestimate how surprised we were by past events and, on the flip side of the coin, explains why we are so astonished when others overlook what now seems obvious, as NASA did in the run-up to the Challenger accident.
Hindsight, because it is always 20/20, ensures that we feel on safe ground when criticising others’ irrationality or lack of foresight, while simultaneously reducing our ability to evaluate past decisions objectively (our own or those of others). It can have an extremely detrimental impact on both decision making and decision makers:
• Decision makers whose decisions don't work out are often punished, because the many factors that lay outside their control are difficult to recognise after the event.
• If decision makers come to expect that their decisions will be scrutinised with hindsight, they are much more likely to seek risk-averse and bureaucratic solutions.
• Irresponsible risk seekers can be undeservedly rewarded when their decisions work out, because the gamble they took is hard to recognise after the event, so they escape punishment for taking too much risk. Meanwhile, anyone who doubted them may get branded as conventional, over-cautious, or plain weak.
• Perhaps most importantly, hindsight severely reduces our ability to learn from past decisions. We'll look at why this is so important in the next couple of chapters.
We are all susceptible to hindsight bias, but recognising it at work can be very difficult.
Running on Instinct
Psychologists use the term heuristics to describe the unconscious mental shortcuts that we take to arrive at judgments or solve problems. Dozens have been identified to date, hindsight bias being just one example. When we are faced with difficult questions, high complexity or ambiguity, or a need for speed, heuristics can help us find answers or solutions that would otherwise be beyond conscious reach. However, because they evolved to help us cope with the demands of our evolutionary past, when we lived on the plains, hunting and gathering, they are imperfect tools, and the biases they introduce can lead to terrible mistakes.
Mental shortcuts can even lead to inappropriate biases in life-or-death situations, as demonstrated by a study co-authored by Amos Tversky which looked at how the way data is presented can affect doctors’ choices. All of the participants received the same data on the effectiveness of two interventions for lung cancer, surgery and radiation treatment, indicating that radiation offered a much better chance of survival in the short term but a lower life expectancy over the following years.
For half of the participants the data was presented in terms of survival rates, whilst for the others it was presented in terms of death rates; for example, the statistics for the surgical treatment of 100 patients were as follows:

Survival frame                               Mortality frame
90 live through the post-operative period    10 die during surgery or the post-operative period
68 are alive at the end of one year          32 are dead by the end of one year
34 are alive at the end of five years        66 are dead by the end of five years
Clearly, from a mathematical point of view, the two columns convey exactly the same information, yet 82% of the doctors presented with the survival data recommended surgery, versus only 56% of those given the mortality perspective. Studies like this demonstrate the enormous influence that heuristics can have on our decision making and, in particular, how difficult it is for us to divorce decisions from their emotional components.
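To see concretely that the two framings are arithmetic complements, here is a small sketch, assuming the figures tabulated above; it simply checks that each mortality figure is 100 minus the corresponding survival figure.

    # Each time point's mortality figure is the complement of its survival figure.
    survival = {"post-operative period": 90, "one year": 68, "five years": 34}
    mortality = {point: 100 - alive for point, alive in survival.items()}

    for point, alive in survival.items():
        print(f"{point}: {alive} alive == {mortality[point]} dead (out of 100)")

The information is identical either way; only the emotional colouring changes.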
Heuristics can be thought of as being much like instincts. Animal instincts are easy to recognise; indeed, we assume that this is how animals do pretty much everything. As human beings, however, we generally prefer to think of ourselves as rational. We like to hang on to the evidence of our conscious experience, which suggests that we perceive the world “accurately” and form our beliefs and opinions based on the facts of the situation. Social psychologist Lee Ross called this “naïve realism”: the conviction that we have the ability to experience events as they are. It enables us to justify any opinion as reasonable, because if it weren't reasonable we wouldn't hold it! Sounds great, doesn't it? And it is completely wrong. The logic of this kind of thinking does not bear scrutiny, but that's okay, because it's an easy choice not to investigate …