Lee G. Bolman

Reframing Organizations



mind‐set. That gives us an incomplete picture, but we fill in the gaps to make everything fit with our current beliefs. Then, in order to act quickly instead of getting lost in thought, we favor the easy and obvious over the complex or difficult. We then code our experience into memory by discarding specifics and retaining generalities or by using a few specifics to represent a larger whole. This reinforces our current mental models, which then shape how we process experience in the future.

| Cognitive Challenge | Solution | Risk |
| --- | --- | --- |
| Too much data to process | Filter out everything except what we see as important and consistent with our current beliefs | Miss things that are important or could help us learn |
| Tough to make sense of a confusing, ambiguous world | Fill in gaps; make things fit with our existing stories and mental models | Create and perpetuate false beliefs and narratives |
| Need to act quickly | Jump to conclusions: favor the simple and obvious over the messy and complex | Quick decisions and actions lead to mistakes and get us in trouble |
| Memory overload | Discard specifics to form generalities, or use a few specifics to represent the whole | Error and bias in memory reinforce current mind‐sets and biases in information processing |

      To a greater or lesser degree, we all rely on these cognitive short‐cuts. President Donald Trump regularly provided visible examples in his tweet storms and off‐the‐cuff communications. In March 2017, he tweeted that his predecessor, Barack Obama, was a “bad (or sick) guy” for tapping Trump's phones prior to the election. Trump apparently based this claim on an article from the right‐wing website Breitbart. Because the charge aligned with his worldview, he figured it must be true, and he continued to insist he was right even after investigators concluded the wiretapping never happened.

      It all adds up to a simple truth that is easy to overlook. The world we perceive is an image we construct in our minds. Ellen Langer, the author of Mindfulness (1989), captures this viewpoint succinctly: “What we have learned to look for in situations determines mostly what we see” (Langer, 2009, p. 33). The ideas or theories we hold determine whether a given situation is foggy or clear, mildly interesting or momentous, a paralyzing disaster, or a genuine learning experience. Personal theories are essential because of a basic fact about human perception: in any situation, there is simply too much happening for us to attend to everything. We drown in a sea of complexity. To help us understand what is going on and what to do next, well‐grounded, deeply ingrained personal theories offer two advantages: they tell us what is important and what is safe to ignore, and they group scattered bits of information into coherent patterns. Mental models shape reality.

      Research in neuroscience has called into question the old adage, “seeing is believing.” It has been challenged by its converse: “Believing is seeing.” The brain constructs its own images of reality and then projects them onto the external world (Eagleman, 2011). “Mental models are deeply held internal images of how the world works, images that limit us to familiar ways of thinking and acting. Very often, we are not consciously aware of our mental models or the effects they have on our behavior” (Senge, 1990, p. 8). Reality is therefore what each of us believes it to be. Shermer (2012) tells us that “beliefs come first, explanations for beliefs follow.” Once we form beliefs, we search for ways to explain and defend them. Today's experience becomes tomorrow's fortified theology.

      Perception and judgment involve matching situational cues with previously learned mental models. In this case, the perceptual data were ambiguous, and expectations were prejudiced by a key missing clue—the radio operator had never mentioned the possibility of a child with a toy. The officer was expecting a dangerous gunman, and that is what he saw.

      Impact of Mental Models

      Changing old patterns and mind‐sets is difficult. It is also risky; it can lead to analysis paralysis, confusion, and erosion of confidence. This dilemma confronts us even when we see no flaws in our current thinking, because our theories are often self‐sealing: they block us from recognizing our errors. Extensive research documents the many ways in which individuals spin reality to protect existing beliefs (see, for example, Garland, 1990; Staw and Hoang, 1995). In one corporate disaster after another, executives insist that they were not responsible but were the unfortunate victims of circumstances. After the mob invasion of the U.S. Capitol in January 2021, no one felt personally responsible, and the blame game was in full swing.

      Extensive research on the “framing effect” (Kahneman and Tversky, 1979) shows how powerful subtle cues can be. Relatively modest changes in how a problem or decision is framed can have a dramatic impact on how people respond (Gigerenzer, Hoffrage, and Kleinbölting, 1991; Shu and Adams, 1995). One study found that doctors responded more favorably to a treatment with “a one‐month survival rate of 90 percent” than to one with “a 10 percent mortality rate in the first month,” even though the two are statistically identical (Kahneman, 2011).