Beth Kanter

The Smart Nonprofit



Cinthia Schuman, Chris Tuttle, Christopher Noessel, Darrell Malone, David A Colarusso, France Q. Hoang, Heejae Lim, Iain De Jong, Jake Garcia, Jake Maguire, Jill Finlayson, John Mayer, Julie Cordua, Kevin Bromer, Leah Post, Leila Toplic, Mohammad Radiyat, Nancy Smyth, Nick Bailey, Nick Hamlin, Ravindar Gujral, Rhodri Davies, Rita Ko, Shalini Kantayya, Steve MacLaughlin, Sue Citro, and Woodrow Rosenbaum.

      We want to give a special thanks to friends and colleagues who read parts of this book, answered questions, and gave us advice (when we asked for it and when we didn't!). In particular, we'd like to thank: Tamara Gropper, Mark Polisar, Lucy Bernholz, Johanna Morariu, Lisa Belkin, and Amy Sample Ward for their input and advice.

PART I UNDERSTANDING AND USING ARTIFICIAL INTELLIGENCE

      INTRODUCTION

      Leah Post has a keen sense of other people's pain. As a program manager at a Seattle social service nonprofit, she uses her gifts to help people who are homeless, or at high risk of homelessness, enter the local support system. An integral part of the intake process is a required assessment tool with the tongue-twisting name VI-SPDAT.

      Every day, Leah asked her clients questions from the VI-SPDAT and entered their answers into the computer. And every day the results failed to match the picture of despair she saw in front of her; her clients should have been top priorities for receiving emergency housing, yet the scores said otherwise.

      Leah knew the basic statistics for the homeless population in King County, home to Seattle. Black people are 6% of the general population but over a third of the homeless population. For Native Americans and Alaska Natives, the disparity is roughly 1% of the general population versus 10% of the homeless population. Most of Leah's clients were Black, and yet time and again white applicants scored higher on the VI-SPDAT, meaning they would receive services first. Leah knew in her gut that something was wrong, and yet automated systems are supposed to be impartial, aren't they?

      Leah was not the only person noticing skewed results. Dozens of social workers joined her in signing a petition in Seattle asking for a review of the process. Other social workers around the country also raised concerns. Finally, researchers at C4 Innovations dug into the data from King County, as well as counties in Oregon, Virginia, and Washington, and found that BIPOC “were 32% less likely than their White counterparts to receive a high prioritization score, despite their overrepresentation in the homeless population.”

      The Department of Housing and Urban Development (HUD) provides homelessness funding to local communities through Continuums of Care (CoCs), consortia of local agencies. This system was created in the 1990s to provide multiple access points for people who are homeless, or at risk of homelessness, through, say, food banks, homeless shelters, or mental health clinics.

      In 2009, HUD began to require CoCs to use a standardized assessment tool to prioritize the most vulnerable people. This was an important switch from the traditional “first come, first served” model. The wait for emergency housing can be years long, and having an opportunity to get to the top of the list is a very big deal for clients. The choice of which tool to use was left up to each CoC.

      Years earlier, Community Solutions, a New York nonprofit specializing in using data to reduce homelessness, created the Vulnerability Index (VI) based on peer-reviewed research. The goal of the VI was to lower barriers for people with physical or mental health vulnerabilities that might prevent them from seeking services. Soon afterward, OrgCode Consulting, Inc., created the Service Prioritization Decision Assistance Tool (SPDAT). Finally, in 2013, OrgCode released a combination of these tools, the VI-SPDAT.

      You may be waiting for some bad guy to emerge in this story: a company gathering data to sell to pharmaceutical companies or a government agency intentionally blocking access to services. There will be stories like that later in this book, but this isn't one of them.

      And yet, the VI-SPDAT was so fundamentally flawed that OrgCode announced in 2021 that it would no longer recommend or support it.

      We use “smart tech” as an umbrella term for advanced digital technologies that make decisions for people, instead of people making them themselves. It includes artificial intelligence (AI) and its subsets and cousins such as machine learning,