about such automated decisions. Vacca was nevertheless satisfied because the commission has a clearly defined mandate: “If machines, algorithms and data make decisions about us, they must at least be transparent. Thanks to the transparency law, we will have a better overview and understanding of algorithmic decision-making, and we will be able to hold agencies accountable.”4 The trend towards more openness and regulation seems unstoppable.

      The legislative initiative has already stimulated a number of changes. The use of algorithms is now on New York’s public agenda – in the City Council, in the media, among the city’s residents. Algorithms are a political issue. A debate is taking place about what they are used for. And they are already used very broadly.

       In the service of safety

      It is not only 911 emergency calls but also computer messages that send New York police officers out on their next assignment.5 No crime has yet occurred at the scene the software assigns to the police. According to the automated data analysis, however, the selected area is likely to be the site of a car theft or burglary in the next few hours – crimes that could be prevented by increased patrols.

      Algorithms are managing law enforcement activities. In the 1990s, New York City was notorious for its high crime rate and gang violence. In a single year, 2,000 murders, 100,000 robberies and 147,000 car thefts took place. New York was viewed as one of the most dangerous cities in the world. Politicians reacted. Under the slogan “zero tolerance,” tougher penalties and higher detection rates were meant to make one thing clear: Crime does not pay.

      But what if modern technology could be used to prevent crime before it even occurs? The New York police force considered this too, although it initially sounded like science fiction. The Spielberg thriller Minority Report, based on the short story by Philip K. Dick, explored the idea in 2002: In a seemingly utopian society, serious crimes no longer happen because three mutants have clairvoyant abilities and reliably report every crime – a week before it is committed. Potential offenders are detained. Chief John Anderton, played in the movie by Tom Cruise, leads the police department and is proud of its results – until one day his own name is spat out by the system. He is now considered a murderer-to-be and desperately tries to prove his innocence.

      In New York City, algorithms play the same role that the three mutants do for Dick and Spielberg: They provide crime forecasts. Yet with one decisive difference: The computer does not predict who will commit a crime in the near future but where it will take place. The term for this is “predictive policing.”

      And it works like this: Software evaluates the history of crime in each New York district over recent years and compares the identified patterns with daily police reports. Crime may seem random at first glance, but certain offenses such as burglary or theft in fact follow patterns that can be worked out. These patterns depend on demographics, the day of the week, the time of day and other conditions. Just as earthquakes occur at the edges of tectonic plates, crime clusters around certain hot spots, such as supermarket parking lots, bars and schools. The predictive policing software marks small squares, 100 to 200 meters on a side, where thefts, drug trafficking or violent crimes have recently taken place and which – according to the analysis – are often followed by further offenses.
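      How such a forecast can be computed is easiest to see in a toy example. The following Python sketch divides a city into small grid squares, weights recent incidents more heavily than older ones and ranks the squares as candidates for extra patrols. The cell size, the recency weighting and the four-week window are illustrative assumptions only; the software actually used in New York is far more sophisticated and, as described below, not public.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Toy grid-based hot-spot scoring. Cell size, recency weighting and the
# 28-day window are invented for illustration, not the NYPD's actual model.
CELL_SIZE = 0.002  # grid step in degrees, very roughly 150-200 meters

def cell_of(lat: float, lon: float) -> tuple:
    """Map a coordinate to the small grid square it falls into."""
    return (int(lat / CELL_SIZE), int(lon / CELL_SIZE))

def hot_spots(incidents, now, top_n=3):
    """Score each square by recent incidents, weighting newer ones higher."""
    scores = defaultdict(float)
    for lat, lon, when in incidents:
        age_days = (now - when).days
        if age_days <= 28:  # only the last four weeks count
            scores[cell_of(lat, lon)] += 1.0 / (1 + age_days)
    # The highest-scoring squares become candidates for extra patrols.
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

now = datetime(2019, 5, 1)
reports = [  # (latitude, longitude, time of report)
    (40.7128, -74.0060, now - timedelta(days=1)),   # two burglaries in
    (40.7129, -74.0061, now - timedelta(days=1)),   # the same square
    (40.7300, -73.9900, now - timedelta(days=20)),  # one older theft
]
print(hot_spots(reports, now))  # the doubly hit square is ranked first
```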

      Since law enforcement officers started using predictive policing, their day-to-day work has changed. In the past, they were only called when a crime had already been committed and needed to be solved. Today, the computer tells them where the next crime is most likely to occur. They once took the same route every day; now the software identifies so-called crime hot spots where their presence is needed to monitor what is going on. The police can thus plan better, deploy their resources more efficiently and work more preventively. “The hope is the holy grail of law enforcement – preventing crime before it happens,” says Washington law professor Andrew G. Ferguson.6 New York Mayor Bill de Blasio sees this more pragmatically and less poetically: Algorithmic systems, he argues, have made police work more effective and more trustworthy. The city is now safer and more livable.7 In fact, within 20 years the number of murders in New York City has fallen by 80 percent to only about 350 per year. Thefts and robberies have also fallen, by 85 percent. How much predictive policing has contributed to this cannot be determined exactly. In any case, the software enables police officers to be where they are needed most.

      The specific functioning of the algorithms, however, remains hidden from the public: How do these programs work? What data do they collect? Lawsuits are pending against the New York police for violating the Freedom of Information Act. The public knows as little about where the algorithms are used, the plaintiffs argue, as about how the calculations are made. The first court to hear the case ruled in favor of the plaintiffs. Nevertheless, the police continue to refuse to publish detailed information about their predictive policing.

      The New York Fire Department also prefers preventing fires to extinguishing them.8 But like the police, it struggles with limited resources. Not all of the 330,000 buildings in New York can be inspected every year. The firefighters must therefore set priorities and identify the buildings most at risk. But which ones are they? This selection process alone used to occupy an entire department. For a few years now, the fire department has been using a computer program that algorithmically calculates each building’s risk of catching fire. Taking into account a building’s size, age, construction material, pest infestation and occupant density as well as the history of fires in the neighborhood, the algorithm creates an inspection list for the next day (see Chapter 10).
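      How such an inspection list might be ranked can be sketched in a few lines, assuming the factors named above are already recorded for each building. The weights below are invented for illustration; the fire department’s real model, described in Chapter 10, is considerably more elaborate and not public.

```python
# Hypothetical weights combining the factors named above; the real FDNY
# model and its weights are not published.
FEATURE_WEIGHTS = {
    "age_years": 0.02,        # older buildings are riskier
    "floor_area_sqm": 0.0005, # size
    "pest_reports": 0.5,      # pest infestation as a proxy for neglect
    "occupant_density": 0.3,  # people per unit of floor space
    "nearby_fires_5y": 0.8,   # fire history in the neighborhood
}

def risk_score(building: dict) -> float:
    """Combine a building's features into a single fire-risk number."""
    return sum(w * building.get(f, 0) for f, w in FEATURE_WEIGHTS.items())

def inspection_list(buildings: list, capacity: int) -> list:
    """Rank all buildings by risk; return tomorrow's inspection candidates."""
    return sorted(buildings, key=risk_score, reverse=True)[:capacity]

stock = [
    {"id": "A", "age_years": 95, "pest_reports": 3, "nearby_fires_5y": 2},
    {"id": "B", "age_years": 10, "pest_reports": 0, "nearby_fires_5y": 0},
]
for b in inspection_list(stock, capacity=1):
    print(b["id"])  # prints "A": old, pest-ridden, fires nearby
```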

       In the service of justice

      “Smaller, safer, fairer.”9 Under this motto, Mayor de Blasio presented his plan to close New York’s largest prison in June 2017.10 In the 1990s, most of the city’s then 20,000 prisoners were incarcerated on Rikers Island, once known as the new Alcatraz. Today, fewer than 10,000 New Yorkers are imprisoned, and Rikers Island, which costs $800 million a year to run, is partly empty. Moreover, the prison was recently shaken by a scandal over the mistreatment of a juvenile detainee. De Blasio therefore has several reasons for wanting to close the facility. He also wants to reduce the number of prisoners further: to 7,000 within five years and to 5,000 in the long term.

      His biggest lever: algorithms. They are supposed to help New York’s judges better assess risks, for example, whether pre-trial detention is necessary or whether early release is appropriate. The probabilities to be assessed are, in the first case, the danger that the accused will flee before trial and, in the second, the risk of recidivism. These probabilities depend on so many factors that a judge can hardly be expected to weigh all of them adequately in the time allotted for each case.

      COMPAS (Correctional Offender Management Profiling for Alternative Sanctions) is the software that calculates the risk of flight and recidivism. The company that developed the program refuses to publish the algorithm behind it, but research by ProPublica, a non-profit organization for investigative journalism, has shown that such systems collect and analyze a large amount of data: age, gender, residential address, and the type and severity of previous convictions. They even gather information on the family environment and on whether the defendant has a telephone connection. All in all, COMPAS collects answers to 137 such questions.
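      Since the COMPAS algorithm itself is kept secret, the following can only be a generic sketch of how a questionnaire-based risk scale works in principle: weighted answers are summed into a raw score, which is then mapped onto a coarse risk band of the kind presented to judges. All questions, weights and thresholds here are invented for illustration.

```python
# COMPAS is a proprietary black box; this sketch only shows the general
# principle of a points-based risk scale with invented questions and weights.
QUESTION_WEIGHTS = {
    "age_under_25": 2,       # 1 if the defendant is under 25, else 0
    "prior_convictions": 1,  # one point per prior conviction
    "unstable_housing": 1,   # 1 if no fixed residence
    "no_phone_at_home": 1,   # 1 if no telephone connection exists
}

def raw_score(answers: dict) -> int:
    """Sum the weighted questionnaire answers into a raw point total."""
    return sum(w * answers.get(q, 0) for q, w in QUESTION_WEIGHTS.items())

def risk_band(score: int) -> str:
    """Map the raw total onto the low/medium/high bands a judge would see."""
    if score <= 2:
        return "low"
    return "medium" if score <= 5 else "high"

defendant = {"age_under_25": 1, "prior_convictions": 3, "unstable_housing": 0}
print(risk_band(raw_score(defendant)))  # 2 + 3 = 5 points -> "medium"
```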

      The potential for providing algorithmic support to judges is huge. In a New York City study, researchers calculated that if prisoners with a low predicted probability of recidivism were released, the total number of detainees could be reduced by 42 percent without increasing the crime rate.11 In Virginia, several courts tested the use of such algorithms. They ordered pre-trial detention in only half as many cases as judges ruling without the software. Despite that, there was no increase in the rate of people who failed to appear for trial or who committed a crime in the interim.

      Algorithmically supported decisions improve forecasts even if they are not 100 percent accurate. They could also reduce the variation in sentences handed down. In New York City, for example, the toughest judge requires bail more than twice as often as the most lenient of his colleagues. The