Michael Nicholas

The Little Black Book of Decision Making



necessary to put the principles I'll be covering into practice. Know that if you do nothing, that is still a decision – it's a decision in favour of the status quo. It's a decision not to change. It's a decision to put comfort ahead of opportunity. And, as you will see, it's a decision that risks you waking up one day and realising that the world has changed and that you didn't take advantage of it. Or, alternatively, you could take the decision to get ahead of the curve, so that you will start to accrue benefits in advance and be ready when the full force of the coming transformation hits us.

      Part One

      No Place for Old Dogs: New Tricks Required

      1

      Let's Get Real: We All Make Mistakes

      At 11.38 a.m. on 28 January 1986, the NASA space shuttle Challenger took off from Kennedy Space Centre at Cape Canaveral, Florida. Seventy-three seconds later, as it broke up, the liquid hydrogen and oxygen that were by then streaming from its ruptured fuel tanks explosively caught fire and enveloped the rapidly disintegrating spacecraft. The deaths of its seven crew members – including Christa McAuliffe, who would have been the first teacher into space – in such a catastrophic and shockingly visible way may well be the reason why this disaster, despite having no real impact on the lives of the vast majority of those observing it, became the third fastest spreading news story ever.

      Following the accident, U.S. President Reagan rapidly set up a special commission (known as the Rogers Commission, after its chairman) to investigate it. The consensus of its members was that the disintegration of the vehicle began after the failure of a seal between two segments of the right solid rocket booster (SRB). Specifically, two rubber O-rings designed to prevent hot gases from leaking through the joint during the rocket motor's propellant burn failed due to cold temperatures on the morning of the launch. One of the commission's members, theoretical physicist Richard Feynman, even demonstrated during a televised hearing, by immersing a sample of the material in a glass of iced water, how the O-rings became less resilient and prone to failure at the temperatures experienced on the day. There is no evidence that any other component of the space shuttle contributed to the failure.

      I've found, from years of asking participants in my decision-making workshops, that most people's memory of that day aligns with the summary in the paragraphs above. Though relatively few can name the precise component involved, they consistently remember the seal failure and little else. This root cause appears unambiguous. So why would the Rogers Commission have concluded, as they did, that the key factors contributing to the accident were NASA's organisational culture and decision-making processes, not the technical fault? We need to take a deeper look.

      First Appearances are Often Deceptive

      Full details of the events leading up to the Challenger disaster are a matter of public record,2 so I won't recount them in detail here. Bear in mind as you read the string of glaring errors below that this was the same organisation that achieved the incredible feat of landing men on the moon and returning them home safely, and which resolutely refused to succumb to the enormous challenges of bringing the stricken Apollo 13 crew back alive after that mission suffered an oxygen tank explosion over two hundred thousand miles from Earth.

      Let's return to that ill-fated Tuesday morning in January 1986. Several key facts shed light on the finding of the Rogers Commission that decision-making errors were at the heart of the catastrophe:

      • The O-rings had not been designed for use in the unusually cold conditions of the morning of the launch, when the temperature was approximately -2 °C. They had never been tested below 10 °C, and there was no test data to indicate that they would be safe at such temperatures (which were around 14 °C lower than those of the coldest previous launch).

      • NASA managers had known for almost a decade, since 1977, that the design of the shuttle's SRB joints contained a potentially catastrophic flaw. Engineers at the Marshall Space Flight Centre had written on several occasions suggesting that the design was unacceptable, but the letters were not forwarded to Morton Thiokol, the contractor responsible for the construction and maintenance of the SRBs.

      • Engineers raised specific warnings about the dangers posed by the low temperatures right up to the morning of the launch, recommending a launch postponement; but their concerns did not reach senior decision makers. The night before the launch, Bob Ebeling, one of four engineers at Morton Thiokol who had tried to stop the launch, told his wife that Challenger would blow up.3

      • In 1985, the problem with the joints was finally acknowledged to be so potentially catastrophic that work began on a redesign, yet even then there was no call for a suspension of shuttle flights. Launch constraints were issued and waived for six consecutive flights and Morton Thiokol persuaded NASA to declare the O-ring problem “closed”.

      • While the O-rings naturally attracted much attention, many other critical components on the shuttle had also never been tested at the low temperatures that existed on the morning of the flight. Quite simply, the space shuttle was not certified to operate in temperatures that low.

      • It seems that one of the most important reasons why NASA staff opposed the delay was that the launch had already been delayed six times. Two of its managers have been quoted as saying, “I am appalled. I am appalled by your recommendation”, and “My God, Thiokol, when do you want me to launch?”4

      With this broader awareness, it is easy to recognise that the technical, and obvious, “cause” of the accident – the O-ring failure – was really just an outcome of the complex structural problems arising from the relationships between the parties involved. Seen in this light, I expect the Commission's conclusion will seem completely unsurprising:

       Failures in communication … resulted in a decision to launch 51-L based on incomplete and sometimes misleading information, a conflict between engineering data and management judgments, and a NASA management structure that permitted the internal flight safety problems to bypass key Shuttle managers.5

      A report by the U.S. House Committee on Science and Technology went further. It agreed with the Rogers Commission on the technical causes of the accident, but was more specific about the contributing causes:

       The Committee feels that the underlying problem which led to the Challenger accident was not poor communication or inadequate procedures as implied by the Rogers Commission conclusion. Rather, the fundamental problem was poor technical decision-making over a period of several years by top NASA and contractor personnel, who failed to act decisively to solve the increasingly serious anomalies in the Solid Rocket Booster joints.6

      The Problem with Hindsight

      In examining the events leading up to the Challenger accident, it would be completely understandable to have the urge to scratch your head and wonder how so many obviously intelligent people (we are talking about rocket science, after all) could have displayed such apparent ineptitude. How did NASA, an organisation that places such importance on safety, end up so flagrantly violating its own rules and appearing to have so little regard for human life?

      “Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.”

      – Daniel Kahneman, Nobel Prize-winning Professor of Psychology and international best-selling author on judgment and decision making

      When a decision has gone badly, the benefit of hindsight often makes the correct decision look as though it should have been blindingly obvious. But once you are aware of this bias, you'll see it everywhere – from the immediate aftermath of the horrendous terrorist atrocities in Paris in November 2015, where the press began questioning how intelligence services had failed to anticipate the attacks as soon as the “facts” leading up to them began to emerge, to football supporters who believe they have far greater expertise at picking the team than the manager, to the times when