that they consider.
March and Simon followed Simon's earlier work (1947) in critiquing the economic view of “rational man,” who maximizes utility by considering all available options and choosing the best. Instead, they argue, both individuals and organizations have limited information and limited capacity to process what they do have. They never know all the options; rather, they gradually alter their aspirations as they search for alternatives. Home buyers often start with a dream house in mind but gradually adapt to the realities of what's available and what they can afford. Rather than “maximizing” by looking for the best option, individuals and organizations “satisfice,” choosing the first option that seems good enough.
Organizational decision making is additionally complicated because the environment is complex. Resources (time, attention, money, and so on) are scarce, and conflict among individuals and groups is constant. Organizational design happens through piecemeal bargaining that holds no guarantee of optimal rationality. Organizations simplify the environment to reduce the demands on limited information‐processing and decision‐making capacities. They simplify by developing “programs”—standardized routines for performing repetitive tasks. Once a program is in place, the incentive is to stay with it as long as the results are marginally satisfactory. Otherwise, the organization is forced to expend time and energy to innovate. Routine tends to drive out innovation because individuals find it easier and less taxing to stick to programmed tasks (which are automatic, well‐practiced, and more certain of success). Thus, a student facing a term‐paper deadline may find it easier to “fritter”—make tea, straighten the desk, text friends, or browse the Web—than write a good opening paragraph. Managers may find it easier to sacrifice quality than change a familiar routine.
March and Simon's book falls primarily within the structural and human resource views. But their discussions of scarce resources, power, conflict, and bargaining recognize the reality of organizational politics. They emphasize framing, even though they do not use the word. Decision making, they argue, is always based on a simplified model of the world. Organizations develop unique vocabulary and classification schemes, which determine what people notice and respond to. Things that don't fit an organization's mind‐set are likely to be ignored or reframed into familiar terms the organization can understand.
When it becomes difficult to identify a promising suspect, a second popular option is to blame the bureaucracy. Things go haywire because organizations are stifled by rules and red tape or, as in the Trump White House, the opposite: chaos resulting from a lack of clear goals, authority, roles, and rules. The solution, then, is to tighten up or to loosen up; otherwise, you pay the price.
By this reasoning, tighter financial controls could have prevented the subprime mortgage meltdown of 2008. The tragedy of 9/11 could have been averted if agencies had had better protocols for detecting such a terrorist plot. But piling on rules and regulations is a direct route to bureaucratic rigidity. Rules can inhibit freedom and flexibility, stifle initiative, and generate reams of red tape. The commission probing the causes of 9/11 concluded: “Imagination is not a gift usually associated with bureaucracies.” When things become too tight, the solution is to “free up” the system so that red tape and rigid rules don't stifle creativity and bog things down. An enduring storyline in popular films is the free spirit who triumphs in the end over silly rules and mindless bureaucrats (examples include the cult classics Office Space and The Big Lebowski). But many organizations vacillate endlessly between too loose and too tight.
A third fallacy attributes problems to thirsting for power. Enron collapsed, you could say, because key executives were more interested in getting rich and expanding their turf than in advancing the company's best interests. This view sees organizations as jungles teeming with predators and prey. Victory goes to the more adroit, or the more treacherous. You need to play the game better than your opponents—and watch your back.
Each of these three perspectives contains a kernel of truth but oversimplifies a knottier reality. Blaming people points to the perennial importance of individual responsibility. People who are rigid, lazy, bumbling, or greedy do contribute to some of the problems we see in organizations. But condemning individuals often distracts us from seeing system weaknesses and offers few workable options. If, for example, the problem is someone's abrasive or pathological personality, what do we do? Even psychiatrists find it hard to alter deeply entrenched character disorders, and firing everyone with a less‐than‐ideal personality is rarely a viable option. Training can go only so far in ensuring semi‐flawless individual performance.
The blame-the-bureaucracy perspective starts from a reasonable premise: organizations exist to achieve specific goals. They usually work better when strategies, goals, and policies are clear (but not excessive), jobs are well defined (but not constricting), control systems are in place (but not oppressive), and employees behave prudently (but not callously). If organizations always operated that way, they would presumably work a lot better than most do. This perspective is better at explaining how organizations should work than why they often fall short. Managers who cling to logic and procedures become discouraged and frustrated when confronted by intractable irrational forces. Year after year, we witness the introduction of new control systems, hear of new ways to reorganize, and are dazzled by emerging management strategies, methods, and gurus. Yet, as in the case of the Wuhan cover-up, old problems persist, seemingly immune to every rational cure we devise. As March and Simon point out, the subterranean features of organizations become salient when the organization is threatened.

Like blaming individuals, dog-eat-dog logic offers a plausible analysis of almost anything that goes wrong. People both seek and despise power but find it a convenient way to explain problems. Within hours of the 9/11 terror attacks, a senior FBI official called Richard Clarke, America's counterterrorism czar, to tell him that many of the terrorists were known members of al-Qaeda.
“How the fuck did they get on board then?” Clarke exploded.
“Hey, don't shoot the messenger. CIA forgot to tell us about them.”
In the context of its chronic battles with the CIA, the FBI was happy to throw a rival under the bus: “We could have stopped the terrorists if CIA had done their job.”
The tendency to blame what goes wrong on people, bureaucracy, or the thirst for power is part of our deeply embedded mental wiring. Such explanations are quick and easy; they let us feel we understand even when we don't. But there is much more to understanding a complex situation than assigning blame, and certain universal peculiarities of organizations make them especially difficult to decipher.
PECULIARITIES OF ORGANIZATIONS
Human organizations can be exciting and challenging places. That's how they are often depicted in management texts, corporate annual reports, and fanciful management thinking. But, as many people find, they can also be deceptive, confusing, and demoralizing. It is a mistake to assume that organizations are either snake pits or rose gardens (Schwartz, 1986). Managers need to recognize characteristics of life at work that create opportunities for the wise as well as hidden traps for the unwary. A case from the public sector provides a typical example:
When Bosses Rush In
Helen Demarco arrived in her office to discover a news item from the local paper. The headline read, “Osborne Announces Plan.” Paul Osborne had arrived two months earlier as Amtran's new chief executive. His mandate was to “revitalize, cut costs, and improve efficiency.”
After 20 years, Demarco had achieved a senior management position at the agency. She had little contact with Osborne, but her boss reported to him. Demarco and her colleagues had been waiting to learn what the new chief had in mind. She was startled as she read the newspaper account. Osborne's plan made technical assumptions directly related to her area of expertise. “He might be a change agent,” she thought, “but he doesn't know much about our technology.” She immediately saw the new plan's fatal flaws. “If he tries to implement this, it'll be the worst management mistake since the Edsel.”
Two days later, Demarco and her colleagues received a memo instructing them to