future communications. Instead of “astroturfing” support for climate change, advocates could use the dividend of time to engage with supporters and get to know them, educate them on the issue, and teach them how to become advocates who build their own groups of supporters.
More time to think: People and organizations are so busy doing the work, and the work to support the work, that there is very little time to reflect on how the work is done and how to improve it. Imagine having time to consider other ways to do intake with clients rather than furiously responding to a barrage of inquiries every day. Imagine having time to talk to supporters about what kind of support they would need to gather their friends and become ambassadors themselves. Imagine having time to just think.
A REAL-WORLD SMART NONPROFIT
TalkingPoints is a great example of a smart nonprofit.
Heejae Lim, founder of TalkingPoints, didn't need to do any research about the difficulty immigrant parents have navigating school systems; she lived it as the child of immigrants. Her family moved from Korea to London when she was eight. Heejae had an advantage that many of her immigrant friends didn't: her mother spoke English well. Heejae's mother was a fierce advocate for her at school. She also translated for the parents of her friends.8
Following business school at Stanford University, Heejae decided to do what she does best: address a difficult problem using her advanced technology know-how. She founded TalkingPoints as a nonprofit to translate messages between teachers and parents.
About a quarter of all school children in the United States speak a language other than English at home. These are families where parents often work multiple jobs, may come from cultures where parents are not supposed to engage with teachers, and most importantly, do not speak English well enough to feel comfortable speaking to teachers.
Family engagement with schools is not a minor issue when it comes to educational outcomes. It is twice as predictive of school success as socioeconomic status. Let that sink in for a second: for school success, it matters much less whether a child comes from a high-income home than whether the adults in that child's life talk to their teachers. However, family-school engagement occurs up to 50% less among families in under-resourced, multilingual communities.9
TalkingPoints’ app works like text messaging on mobile devices. It operates in over 100 languages, provides closed captioning on video messages for parents who may not be comfortable writing, and enables parents and guardians to engage with teachers in the cracks of their very busy days.
TalkingPoints started with a simple proof of concept and a small group of families. Using a Google spreadsheet, a text messaging app, and human translators, Heejae and her team simulated an automated process from end to end. Here's how it worked. A parent sent a text message in their native language. Heejae added it to the spreadsheet. Next, a human translator translated the message into English. Heejae texted the translated message to the teacher or administrator. And back and forth they went. Heejae told us, “We always start with a proof of concept with a small group before we build, as part of doing no harm.”
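For readers who want to picture the mechanics, here is a minimal sketch of what that manual relay could look like if modeled in code. The spreadsheet becomes a plain list of rows, and translate_by_human() and send_text() are hypothetical stand-ins for the volunteer translator and the texting app; none of this is TalkingPoints' actual code.

```python
# Sketch of the manual proof-of-concept relay described above.
# All names here are illustrative, not TalkingPoints' real system.
from dataclasses import dataclass

@dataclass
class Row:
    sender: str            # "parent" or "teacher"
    original: str          # message as written
    language: str          # sender's language
    translation: str = ""  # filled in by the human translator

spreadsheet: list[Row] = []  # stands in for the Google spreadsheet

def translate_by_human(text: str, source_lang: str) -> str:
    # Stand-in: in the pilot, a volunteer typed the translation by hand.
    return f"[{source_lang}->en] {text}"

def send_text(recipient: str, message: str) -> None:
    # Stand-in for the text messaging app.
    print(f"to {recipient}: {message}")

def relay_parent_message(text: str, language: str, teacher: str) -> None:
    row = Row(sender="parent", original=text, language=language)
    row.translation = translate_by_human(text, language)  # human step
    spreadsheet.append(row)                               # log, as in the sheet
    send_text(teacher, row.translation)                   # forward to teacher

relay_parent_message("¿Cómo le va a mi hija en matemáticas?", "es", "Ms. Rivera")
```

The same relay ran in reverse for teachers' replies, which is what let the team test the end-to-end experience before writing any production software.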
The team learned a lot from this early testing, then automated the process and launched a pilot version in 2015 with 100 families. Most importantly, they learned they couldn't rely on off-the-shelf translation tools alone because these tools often misinterpreted context and cultural norms. Volunteer translators review the machine translations to ensure that they are culturally accurate and that educational terms are translated correctly. These corrections are fed back into the model to continuously improve it. The aim is to build a database of translations large and deep enough that only the difficult-to-understand conversations need to be routed to human translators.
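That routing goal, automate the easy messages and escalate the hard ones, can be sketched in a few lines. The function names and the 0.9 confidence threshold below are illustrative assumptions, not TalkingPoints' real pipeline.

```python
# Sketch of confidence-based routing with a human in the loop.
CONFIDENCE_THRESHOLD = 0.9
training_pairs: list[tuple[str, str]] = []  # grows with every human correction

def machine_translate(text: str) -> tuple[str, float]:
    # Stand-in for an MT model that returns a translation plus a confidence
    # score; real systems expose something comparable (e.g., model probability).
    return f"[mt] {text}", 0.62

def human_review(text: str, draft: str) -> str:
    # Stand-in for a volunteer checking cultural nuance and school terminology.
    return draft.replace("[mt]", "[reviewed]")

def translate(text: str) -> str:
    draft, confidence = machine_translate(text)
    if confidence >= CONFIDENCE_THRESHOLD:
        return draft                       # easy message: fully automated
    final = human_review(text, draft)      # hard message: human in the loop
    training_pairs.append((text, final))   # correction becomes training data
    return final

print(translate("Mañana hay junta de padres."))
```

The design choice worth noticing is that human judgment is reserved for the messages the machine is least sure about, which is how the service can scale without dropping cultural accuracy.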
TalkingPoints meets our definition of a smart nonprofit: an organization that is human-centered, prepared, knowledgeable, and reflective. People are deeply engaged in the process; the team designed the app carefully and is implementing it thoughtfully; they have done their homework about cultural competence, education, and the needs of immigrant families; the staff includes experts in education and technology; and everyone has experience working with immigrant families and schools.
The results show the effectiveness of working this way.
By 2019, the app had facilitated over 20 million conversations for 500,000 parents and teachers. TalkingPoints is also free for users. An outside research firm was engaged to evaluate the effort. It found that:
89% of the schools using the app serve low-income children.
97% of teachers have found TalkingPoints helpful in meeting their family engagement goals.
98% of teachers were able to reach families they had never reached before.
83% of teachers believe that they are more informed about students’ needs because of their relationships with families.10
The VI-SPDAT blocked access to services; TalkingPoints creates access for a woefully underserved population. These cases highlight why it is so important for organizations to step carefully into the use of smart tech.
THE DANGERS OF AUTOMATION
Nicholas Carr wrote in The Glass Cage, “Automation severs ends from means. It makes getting what we want easier, but it distances us from the work of knowing.”11
There is enormous danger and damage to be done in distancing ourselves from knowing. It means potentially cutting ourselves off from the needs of clients if they are first interacting with bots screening them for services. It could mean using automation to send out many times more fundraising appeals and not listening to the complaints from current and prospective donors. It could mean hiding behind screens instead of stepping out to build stronger relationships with constituents. And it could mean allowing an insidious form of racism and sexism to take hold unabated inside your organization.
We tend to see work done by computers and robots as incapable of being swayed by emotions and therefore incapable of being racist, sexist, or otherwise biased or unfair. However, the code that powers smart tech was at some point created by people and carries forward their opinions, assumptions, and biases. When this code makes decisions that are discriminatory, we call it embedded bias. The renowned data scientist Cathy O'Neil says, “Algorithms are opinions embedded in code.”12
Embedded biases are very difficult to undo. Programmers make thousands of choices under the hood of smart tech that the rest of us can't see. And because automation is increasingly used to make vital, life-changing decisions about people, the choices that programmers (overwhelmingly white men) make, based on their own experiences and backgrounds, become all the more consequential.
For instance, smart tech is increasingly used to screen applications for mortgages. It is illegal to ask questions about, say, race in these applications, so programmers create “proxies,” or substitute questions, to build a profile of an applicant. For instance, a zip code could be used as a proxy for “safe neighborhood.” Safe generally means white, particularly for white programmers drawing on their own life experiences. In addition, data is needed to train smart tech systems. An automated mortgage screening process will learn from the enormous data sets of past mortgage application decisions. Black applicants were historically denied mortgages at astonishing rates and are therefore woefully underrepresented in those data sets. In this way, seemingly benign programming decisions, mundane proxies, and historic data create embedded biases against people of color that are difficult to see from the outside.
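A toy example makes the mechanism concrete. The sketch below, with entirely invented data, shows how a rule that never sees race can still reproduce historical redlining once zip code stands in as a proxy; it is not any lender's actual system.

```python
# Toy illustration of embedded bias via a proxy variable. All data invented.
# Historical decisions: approvals were systematically lower in zip 60620.
historical = [
    # (zip_code, income_in_thousands, approved)
    ("60614", 55, True), ("60614", 48, True), ("60614", 40, True),
    ("60620", 62, False), ("60620", 50, False), ("60620", 45, True),
]

def learned_rule(zip_code: str, income: int) -> bool:
    """A 'model' that simply imitates past approval rates by zip code,
    the kind of pattern a real classifier would pick up from this data."""
    past = [approved for z, _, approved in historical if z == zip_code]
    approval_rate = sum(past) / len(past)
    return approval_rate > 0.5 and income >= 40

# The rule never sees race, yet applicants from the historically
# redlined zip code are rejected even with higher incomes:
print(learned_rule("60614", 45))  # True
print(learned_rule("60620", 60))  # False
```

Nothing in that rule looks discriminatory on its face, which is exactly the point: the bias lives in the training data and the proxy, not in any line a reviewer would flag.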
Once bias is baked into smart tech, it stays there forever and