Lawrence S. Maisel

AI-Enabled Analytics for Business

System 2 (thinking slow) is the analytical, “critical thinking” way of making decisions. Most of us identify with System 2 thinking: we consider ourselves rational, analytical beings, and so we believe we spend most of our time engaged in System 2. In fact, almost all of our decision-making takes place in System 1. Only when we encounter something unexpected, or when we make a conscious effort, do we engage System 2.

      System 1 thinking produces various forms of bias; several of the critical modes of bias most widely recognized by behavioral psychologists are discussed next:

       Inherent bias: One of the biggest problems with System 1 is that it seeks to quickly create a coherent, plausible story—an explanation for what is happening—by relying on associations, memories, pattern-matching, and assumptions. The amount and quality of the data on which the story is based are largely irrelevant; System 1 will default to a plausible, convenient story even if that story is based on incorrect information. For example, suppose a customer who usually orders a certain product places an order for an amount considerably less than expected. Management assumes the customer’s business is down when, in fact, a competitor has captured the customer’s business. In effect, management has rationalized the event rather than seeking objective information about the cause.

       Hindsight bias: People reconstruct a story around past events and underestimate the extent to which they were surprised by those events. This is the “I knew it all along” bias. If an event comes to pass, people exaggerate the probability that they knew it was going to occur; if an event does not occur, people erroneously recall that they thought it was unlikely. In either case, these interpretations are based on subjective (biased) use of data. For example, revenue forecasts received from marketing indicated that product sales would grow even though last month’s sales were below budget. Actual sales then came in materially below the forecast, and the executive says, “I knew it all along,” yet that concern was not acted upon when the forecast was accepted.

       Confirmation bias: People are quick to seize on limited evidence that confirms their existing perspective, and they ignore or fail to seek evidence that runs contrary to the coherent story they have already created in their mind. For example, imagine a business considering launching a new product. The CEO has an idea for the “next big thing” and directs the team to conduct market research. The team launches surveys, focus groups, and competitive analysis. However, to satisfy the CEO, the team seeks to confirm the idea, accepting only evidence that supports the feasibility of the product and disregarding contradictory information.

       Noise bias: According to Kahneman and Sibony,7 noise is the unwanted variability in judgments that should agree but instead go in different directions. For example, a company is building a new plant to manufacture its recently approved flu medicine, and the plant is scheduled to be online in six months. When the project team was asked to estimate (judge) when the first shipment could be expected, the estimates ranged from 3 months ahead of schedule to 12 months behind. This variability is the noise in judgment, and it significantly influences the decisions that follow and the operations affected by those decisions (a simple way to quantify this spread is sketched below).
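
      To make the idea of noise concrete, it can be quantified as the statistical spread of independent judgments of the same question. The short Python sketch below is an illustration only; the estimates are hypothetical, modeled loosely on the plant example above, and the code is not from the book. It shows that the spread of the judgments, not their average, is what reveals the noise:

      # Illustrative sketch (not from the book): quantifying "noise" as the
      # spread of independent judgments of the same question.
      # Hypothetical team estimates, in months relative to the scheduled
      # first shipment (negative = ahead of schedule).
      import statistics

      estimates = [-3, -1, 0, 2, 4, 6, 9, 12]

      mean_estimate = statistics.mean(estimates)    # the average judgment
      noise = statistics.stdev(estimates)           # variability across judges
      spread = max(estimates) - min(estimates)      # range of judgments

      print(f"Mean estimate: {mean_estimate:.1f} months vs. schedule")
      print(f"Noise (std dev of judgments): {noise:.1f} months")
      print(f"Range of judgments: {spread} months")

      With these hypothetical numbers, the judgments span 15 months, a spread far larger than any single estimate suggests and precisely the variability a decision-maker never sees when only one judgment is reported.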

      These examples illustrate the risks inherent in individual biases that can steer decision-making in the wrong direction. They also demonstrate the need for unbiased, AI-enabled analytics input as a powerful counterbalance that supports more effective decisions and improved business performance.

      As humans, we cannot avoid the natural instinct that drives us to System 1 thinking for most of our daily lives. What we can do is recognize when we are relying on it inappropriately for decision-making and deliberately engage System 2 thinking, incorporating AI and analytics, as the preferred way to arrive at important business decisions and actions.

      The era of human judgment for decision-making needs to evolve into a process that is more objective, insightful, and unbiased. AI-enabled analytics is the vehicle to introduce into the decision process to support or challenge human judgment. As such, organizations must adopt an Analytics Culture that values data-driven decisions as essential to assuring and improving business performance.

      1. The actual quote is, “Remember that all models are wrong; the practical question is how wrong do they have to be to not be useful”: Box, G.E.P. and Draper, N.R. (2007). Response Surfaces, Mixtures, and Ridge Analyses, 63. John Wiley & Sons.

      2. Daniel Kahneman, an Israeli-born psychologist, received the Nobel Memorial Prize in Economic Sciences in 2002 for integrating psychological research into economic science; the prize recognized work done jointly with Amos Nathan Tversky, an Israeli cognitive and mathematical psychologist, who died in 1996. Their pioneering work examined human judgment and decision-making under uncertainty and uncovered systematic cognitive biases in how people handle risk.

      3. Kahneman, D. (2011). Thinking, Fast and Slow. Farrar, Straus and Giroux.

      4. The term was introduced by Jerry B. Harvey in a 1974 article, “The Abilene Paradox: The Management of Agreement.” The name of the phenomenon comes from an anecdote that Harvey uses in the article to elucidate the paradox.

      5. Schmidt, A. (2016). Groupthink. In: Encyclopedia Britannica. https://www.britannica.com/science/groupthink.

      6. Sims, R.R. (1994). Ethics and Organizational Decision Making: A Call for Renewal, 55–56. Greenwood Publishing Group.

      7. Adapted from Kahneman, D. and Sibony, O. (2021). Noise: A Flaw in Human Judgment. Hachette Book Group. As discussed with Kahneman and Sibony in: McKinsey. (2021). Sounding the alarm on system noise. Strategy & Corporate Finance Practice.

      Myths that are believed tend to become true.

      Similarly, there is a need to dispel some “common knowledge” about implementing AI-enabled analytics. One example is the belief that implementing analytics is complex and expensive. However, as we discuss in detail in the chapters that follow, the Roadmap to AI-enabled analytics is not hard, long, or expensive—it is simply disciplined.

      Executives must take heed to avoid the “myths and misconceptions” about an Analytics Culture, which include (i) the data scientist misconception and myth, (ii) shot in the dark, (iii) bass-ackward, (iv) AI is not IT, (v) big is not better, and (vi) not now. As we will explain, data scientists, consultants, and IT all have roles in AI and analytics; but when implementing an Analytics Culture, the user is the primary player.