mind‐set. That gives us an incomplete picture, but we fill in the gaps to make everything fit with our current beliefs. Then, in order to act quickly instead of getting lost in thought, we favor the easy and obvious over the complex or difficult. We then code our experience into memory by discarding specifics and retaining generalities or by using a few specifics to represent a larger whole. This reinforces our current mental models, which then shape how we process experience in the future.
Exhibit 2.3. Cognitive Biases.

| Cognitive Challenge | Solution | Risk |
| --- | --- | --- |
| Too much data to process | Filter out everything except what we see as important and consistent with our current beliefs | Miss things that are important or could help us learn |
| Tough to make sense of a confusing, ambiguous world | Fill in gaps; make things fit with our existing stories and mental models | Create and perpetuate false beliefs and narratives |
| Need to act quickly | Jump to conclusions—favor the simple and obvious over the messy and complex | Quick decisions and actions lead to mistakes and get us in trouble |
| Memory overload | Discard specifics to form generalities, or use a few specifics to represent the whole | Errors and biases in memory reinforce current mind‐sets and biases in information processing |
To a greater or lesser degree, we all rely on these cognitive short‐cuts. President Donald Trump regularly provided visible examples in his tweet storms and off‐the‐cuff communications. In March 2017, he tweeted that his predecessor, Barack Obama, was a “bad (or sick) guy” for allegedly tapping Trump's phones prior to the election. Trump apparently based this claim on an article from the right‐wing website Breitbart. Since the charge aligned with Trump's worldview, he figured it must be true, and he continued to insist he was right even after investigators concluded the wiretapping never happened.
These biases and limits in human thinking, combined with the complexity of human systems, often lead us to act before we really understand what's going on. As one highly placed female executive reported to us, “I thought I'd covered all the bases, but then I suddenly realized that the rest of my team were playing football.” Faced with an unending barrage of puzzles or “messes,” managers first need to grasp an accurate picture of what is happening in the moment. Then they must move to a deeper level of understanding, asking, “What is really going on here?” When this step is omitted, managers too often rush to judgment, forming superficial analyses and pouncing on the solutions nearest at hand or most in vogue. Market share declining? Try strategic planning. Customer complaints? Put in a quality program. Profits down? Time to reengineer or downsize. A better alternative is to think, to probe more deeply into what is really going on, and to develop an accurate diagnosis. The ability to size up a situation quickly is at the heart of leadership. Admiral Carlisle Trost, former Chief of Naval Operations, once remarked, “The first responsibility of a leader is to figure out what is going on … That is never easy to do because situations are rarely black or white, they are a pale shade of gray … they are seldom neatly packaged.”
It all adds up to a simple truth that is easy to overlook. The world we perceive is an image we construct in our minds. Ellen Langer, the author of Mindfulness (1989), captures this viewpoint succinctly: “What we have learned to look for in situations determines mostly what we see” (Langer, 2009, p. 33). The ideas or theories we hold determine whether a given situation is foggy or clear, mildly interesting or momentous, a paralyzing disaster, or a genuine learning experience. Personal theories are essential because of a basic fact about human perception: in any situation, there is simply too much happening for us to attend to everything. We drown in a sea of complexity. To help us understand what is going on and what to do next, well‐grounded, deeply ingrained personal theories offer two advantages: they tell us what is important and what is safe to ignore, and they group scattered bits of information into coherent patterns. Mental models shape reality.
Research in neuroscience has challenged the old adage “seeing is believing” with its converse: “believing is seeing.” The brain constructs its own images of reality and then projects them onto the external world (Eagleman, 2011). “Mental models are deeply held internal images of how the world works, images that limit us to familiar ways of thinking and acting. Very often, we are not consciously aware of our mental models or the effects they have on our behavior” (Senge, 1990, p. 8). Reality is therefore what each of us believes it to be. Shermer (2012) tells us that “beliefs come first, explanations for beliefs follow.” Once we form beliefs, we search for ways to explain and defend them. Today's experience becomes tomorrow's fortified theology.
In November 2014, two police officers in Cleveland received a radio report of a “black male sitting on a swing pulling a gun out of his pants and pointing it at people” in a city park (Holloway, 2015). Arriving at the site, one officer spotted the suspect and saw him reach for his gun. The officer immediately shot and killed the suspect. The officer might have responded differently if the radio report had included two additional details: the caller who made the initial report had said that the suspect might be a juvenile, and that the gun was probably fake. The gun was a toy replica of a Colt semiautomatic pistol. The victim, Tamir Rice, was 12 years old but, at 195 pounds, might have looked like an adult at a quick glance. The officer who shot him was a rookie who had been hired in Cleveland after being forced out of a suburban department that had rated him unqualified for police work (Flynn, 2016).
Perception and judgment involve matching situational cues with previously learned mental models. In this case, the perceptual data were ambiguous, and expectations were prejudiced by a key missing clue—the radio operator had never mentioned the possibility of a child with a toy. The officer was expecting a dangerous gunman, and that is what he saw.
Impact of Mental Models
Changing old patterns and mind‐sets is difficult. It is also risky; it can lead to analysis paralysis, confusion, and erosion of confidence. This dilemma persists even when we see no flaws in our current thinking, because our theories are often self‐sealing: they block us from recognizing our errors. Extensive research documents the many ways in which individuals spin reality to protect existing beliefs (see, for example, Garland, 1990; Staw and Hoang, 1995). In one corporate disaster after another, executives insist that they were not responsible but were unfortunate victims of circumstance. After the mob invasion of the U.S. Capitol in January 2021, no one felt personally responsible, and the blame game was in full swing.
Extensive research on the “framing effect” (Kahneman and Tversky, 1979) shows how powerful subtle cues can be. Relatively modest changes in how a problem or decision is framed can have a dramatic impact on how people respond (Gigerenzer, Hoffrage, and Kleinbölting, 1991; Shu and Adams, 1995). One study found that doctors responded more favorably to a treatment with “a one‐month survival rate of 90 percent” than to one with “a 10 percent mortality rate in the first month,” even though the two are statistically identical (Kahneman, 2011).
Many of us sometimes recognize that our mental models or maps influence how we interpret the world. It is less widely understood that what we expect often determines what we get. Rosenthal and Jacobson (1968) studied schoolteachers who were told that certain students in their classes were “spurters”—students who were “about to bloom.” The so‐called “spurters,” who had been randomly selected, achieved above‐average gains on achievement tests. They really did spurt. Somehow, the teachers' expectations were communicated to and assimilated by the students. Medical science is still probing the placebo effect—the power of sugar pills to make people better (Hróbjartsson