Beth Kanter

The Smart Nonprofit



future communications. Instead of “astroturfing” support for climate change, advocates could use the dividend of time to engage with supporters and get to know them, educate them on the issue, and teach them how to become advocates and create their own group of supporters.

       More time to think: People and organizations are so busy doing the work, and the work that supports the work, that there is very little time to reflect on how the work is done and how to improve it. Imagine having time to consider other ways to do client intake rather than furiously responding to a barrage of inquiries every day. Imagine having time to talk to supporters about what kind of support they would need to gather their friends and become ambassadors. Imagine having time to just think.

      TalkingPoints is a great example of a smart nonprofit.

      Following business school at Stanford University, Heejae decided to do what she does best: address a difficult problem using her advanced technology know-how. She founded TalkingPoints as a nonprofit to translate messages between teachers and parents.

      About a quarter of all schoolchildren in the United States speak a language other than English at home. In these families, parents often work multiple jobs, may come from cultures where parents are not supposed to engage with teachers, and, most importantly, may not speak English well enough to feel comfortable talking with teachers.

      TalkingPoints’ app works like text messaging on mobile devices. It operates in over 100 languages, provides closed captioning of video messages for parents who may not be comfortable writing, and enables parents and guardians to engage with teachers in the cracks of their very busy days.

      The team learned a lot from this early testing and launched the automated pilot version in 2015 with 100 families. Most importantly, they learned that they couldn't rely on off-the-shelf translation tools alone because those tools often misinterpreted context and cultural norms. Volunteer translators review the machine translations to ensure cultural accuracy and that educational terms are translated correctly. These reviewed translations are fed back into the model to continuously improve it. The aim is to build a database of translations large and deep enough that only the difficult-to-understand conversations need to be routed to human translators.

      TalkingPoints meets our definition of a smart nonprofit: an organization that is human-centered, prepared, knowledgeable, and reflective. People are deeply engaged in the process; the team designed and is implementing the app carefully and thoughtfully; they have done their homework on cultural competence, education, and the needs of immigrant families; the staff includes experts in both education and tech; and everyone has experience working with immigrant families and schools.

      The results show the effectiveness of working this way.

       89% of the schools using the app serve low-income children.

       97% of teachers have found TalkingPoints helpful in meeting their family engagement goals.

       98% of teachers were able to reach families they had never reached before.

       83% of teachers believe that they are more informed about students’ needs because of their relationships with families.10

      VI-SPDAT blocked access to services. TalkingPoints creates access for a woefully underserved population. These cases highlight why it is so important for organizations to step carefully into the use of smart tech.

      Embedded biases are very difficult to undo. Programmers make thousands of choices under the hood of smart tech that the rest of us can't see. Automation is increasingly being used to make vital, life-changing decisions about people. That makes the choices programmers (overwhelmingly white men) make, based on their own experiences and backgrounds, all the more consequential.

      For instance, smart tech is increasingly used to screen mortgage applications. It is illegal to ask questions about, say, race in these applications, so programmers create “proxies,” or substitute variables, to build a profile of an applicant. A zip code, for instance, could be used as a proxy for “safe neighborhood.” Safe generally means white, particularly for white programmers drawing on their own life experiences. In addition, data are needed to train smart tech systems. An automated mortgage screening process will draw on the enormous data sets of past mortgage application decisions. Black people were historically denied mortgages at astonishing rates and are therefore woefully underrepresented in those data sets. In this way, seemingly benign programming decisions, mundane proxies, and historical data create embedded biases against people of color that are difficult to see from the outside.