
Mathematicians Urge Colleagues to Boycott Police Work in Wake of Killings

More than 1,400 researchers have signed a letter calling on the discipline to stop working on predictive-policing algorithms and other models.

Police officers at a Black Lives Matter rally in Los Angeles, California, in May. Gina Ferazzi/Los Angeles Times via Getty

The tide of reckoning on systemic racism and police brutality that has been sweeping through institutions — including scientific ones — has reached universities’ normally reclusive mathematics departments. A group of mathematicians in the United States has written a letter calling for their colleagues to stop collaborating with police because of the widely documented disparities in how US law-enforcement agencies treat people of different races and ethnicities. They concentrate their criticism on predictive policing, a maths-based technique aimed at stopping crime before it occurs.

The letter, dated 15 June, is addressed to the trade journal Notices of the American Mathematical Society (AMS), and comes in the wake of Black Lives Matter protests in the United States and around the world, sparked by the killing of George Floyd by a police officer in Minneapolis, Minnesota, in May. More than 1,400 researchers have now joined the call.

In recent years, mathematicians, statisticians and computer scientists have been developing algorithms that crunch large amounts of data and claim to help police reduce crime — for instance, by suggesting where crime is most likely to occur and focusing more resources in those areas. Software based on such algorithms is in use in police departments across the United States, although how many use it is unclear. Many researchers contest its effectiveness.

But “given the structural racism and brutality in US policing, we do not believe that mathematicians should be collaborating with police departments in this manner”, the mathematicians write in the letter. “It is simply too easy to create a ‘scientific’ veneer for racism.”

“The activity of collaborating with the police is not something we feel a mathematician should be doing,” says co-author Jayadev Athreya, a mathematician at the University of Washington in Seattle. (He and the other writers emphasize that the letter represents their own views and not those of their employers.)

The AMS says that it “has no official position on mathematicians’ involvement in providing expertise to law-enforcement agencies, or to companies that do business with such agencies”.

Historical biases

The letter singles out as an example the company PredPol of Santa Cruz, California, which was co-founded by a mathematician. PredPol provides police departments with software that suggests geographical locations in which crime is likely to happen on any given day, on the basis of statistical patterns of previous crime.
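
The letter does not spell out the mathematics, but the company's published study [1] describes a self-exciting point-process model of the kind used for earthquake aftershocks: each recorded crime temporarily raises the expected crime rate in its neighbourhood, and the locations with the highest predicted rates are flagged for patrol. The sketch below illustrates only that general idea; the grid cells, decay constant, background rate and incident data are all invented, not PredPol's actual model or parameters.

```python
import math

# Illustrative past crimes: (grid_cell, days_ago). Real systems use
# geocoded incident reports; these values are invented.
past_crimes = [((2, 3), 1), ((2, 3), 4), ((2, 4), 2), ((5, 1), 10)]

DECAY = 0.2        # assumed decay rate per day (illustrative only)
BACKGROUND = 0.05  # assumed constant background rate per cell

def predicted_rate(cell, events):
    """Background rate plus exponentially decaying boosts from past
    events in the same cell: a crude self-exciting model."""
    boost = sum(math.exp(-DECAY * days_ago)
                for c, days_ago in events if c == cell)
    return BACKGROUND + boost

# Rank cells by predicted rate and flag the top ones as 'hotspots'.
cells = {c for c, _ in past_crimes}
hotspots = sorted(cells, key=lambda c: predicted_rate(c, past_crimes),
                  reverse=True)[:2]
print(hotspots)  # cells such a model would suggest for extra patrols
```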

Critics say that the data used to feed such algorithms contain racial biases. In addition, they say that the ‘control conditions’ for predictive policing — ordinary policing — are themselves racially skewed. “We have studies that show that certain crimes, say, drug consumption, is the same between white people and Black people for example,” says Sandra Wachter, a jurist at the University of Oxford, UK, who studies the legal and ethical implications of technology. “But in terms of who gets charged for those crimes — who gets stopped, or who gets convicted — there is a very strong racial bias.”

But PredPol chief executive Brian MacDonald argues that in his company’s case, there is no risk that historical biases reflected in crime statistics would affect predictions, because the data the company uses are inherently less biased than other types of crime statistic.

Franklin Zimring, a criminologist at the University of California, Berkeley, says that the value of predictive-policing tools has not been proved, and that the software can lead to feedback loops. “If police presence itself is a biasing influence on measurable offence volume, you shouldn’t use those events as a basis for allocating police resources, or it’s a self-fulfilling prophecy,” Zimring says.
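
A toy simulation makes the feedback-loop worry concrete. Assume, purely for illustration, that two areas have identical true offence volumes, that offences are recorded only in proportion to patrol presence, and that each round shifts patrols toward the area with more recorded crime; any initial imbalance then ratchets up until one area absorbs all the patrols. All numbers below are invented.

```python
# Toy model of the feedback loop Zimring describes: both areas have the
# same true offence volume, but offences are recorded in proportion to
# patrol presence, and patrols then shift toward the 'high-crime' area.
TRUE_CRIME = [100, 100]   # identical underlying crime in both areas
patrols = [0.55, 0.45]    # a slight initial imbalance

for step in range(10):
    recorded = [TRUE_CRIME[i] * patrols[i] for i in range(2)]
    shift = 0.05 if recorded[0] > recorded[1] else -0.05
    patrols[0] = min(max(patrols[0] + shift, 0.0), 1.0)
    patrols[1] = 1.0 - patrols[0]
    print(f"step {step}: recorded={recorded}, "
          f"patrol share={patrols[0]:.2f}/{patrols[1]:.2f}")

# Despite identical true crime, patrols converge on one area:
# the recorded data 'confirm' the allocation that produced them.
```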

MacDonald argues, however, that PredPol uses only crimes reported by victims, such as burglaries and robberies, to inform its software. “We never do predictions for crime types that have the possibility of officer-initiated bias, such as drug crimes or prostitution,” he says.

That leaves the question of how effective the company’s technologies are. Last year, an external review that looked at eight years of PredPol use by the Los Angeles Police Department in California concluded that it was “difficult to draw conclusions about the effectiveness of the system in reducing vehicle or other crime”. A 2015 study published in the Journal of the American Statistical Association [1] and co-authored by the company’s founders looked at two cities that had deployed its software, and showed that the algorithms were able to predict the locations of crimes better than a human analyst could.

A separate study by some of the same authors [2] found that “there were no significant differences in the proportion of arrests by racial-ethnic group between control and treatment conditions”.

MacDonald also says that, unlike some other companies that work with law enforcement, PredPol discloses how its algorithms work, and undergoes audits. This might have exposed the company to more media scrutiny than others, and could be why it was singled out in the letter, says MacDonald. “As a result of being so open, we are more visible and therefore potentially more subject to criticism and scrutiny than other companies.”

Cancelled lecture

Wachter shares the letter writers’ concerns, but she adds that mathematicians should also engage with researchers in other disciplines to study the problem and suggest solutions. “A siloed conversation is problematic,” Wachter says.

Andrea Bertozzi, an applied mathematician at the University of California, Los Angeles, who has worked on mathematical models of crime, agrees. “I think these are good discussions for people to have,” she says. Research done in the past decade shows that algorithms need to be used with care, and that researchers need to engage with the communities that are likely to be affected, Bertozzi adds.

Bertozzi co-authored the 2015 PredPol paper [1] and has invested in the company. For this reason, some mathematicians were concerned by an early-June announcement that she had been invited to give the 2021 Emmy Noether Lecture next January, a prestigious event sponsored by the AMS and the Association for Women in Mathematics (AWM). Some commenters on social media said the timing of the announcement was offensive (it came during protests after Floyd’s killing). Bertozzi says that her lecture was not going to cover policing algorithms, but nevertheless, “I suggested to the AWM to cancel the talk, rather than generate division in the math community right now,” she says. “I have empathy towards people who are hurting.”

The AWM announced the cancellation in an 11 June tweet. It was a mutual decision between the AMS, the AWM and Bertozzi, says Scott Turner, director of communications at the AMS, which is based in Providence, Rhode Island.

“AWM apologizes for our insensitivity in the timing of the announcement last week of the lecturer and the pain it caused,” the society wrote in a subsequent tweet. “We recognize that we have ongoing work to do in order to be an organization that fights for social justice and we are committed to doing what is necessary.”

doi: 10.1038/d41586-020-01874-9

References

  1. Mohler, G. O. et al. J. Am. Stat. Assoc. 110, 1399–1411 (2015).
  2. Brantingham, P. J., Valasik, M. & Mohler, G. O. Stat. Public Policy 5, 1–6 (2017).
