Is crime prediction software the way forward for modern policing? Or biased against minorities?

Is PredPol the 'holy grail' of crime prevention, or do its secret algorithms threaten civil liberties and target minority groups?

Justin Jouvenal
Tuesday 22 November 2016 07:56 EST
Sergeant Coleman of the LAPD checks in with Jamie Bromley in her homeless encampment in the Pacoima neighborhood of Los Angeles (Patrick T. Fallon)

Sergeant Charles Coleman popped out of his police SUV and scanned a trash-strewn street popular with the city's homeless, responding to a crime that hadn't yet happened. It wasn't a 911 call that brought the Los Angeles Police Department officer to this spot, but a whirring computer crunching years of crime data to arrive at a prediction: a car theft or burglary would probably occur near here on this particular morning.

Hoping to head it off, Coleman inspected a line of ramshackle RVs used for shelter by the homeless, roused a man sleeping in a pickup truck and tapped on the side of a shack made of plywood and tarps. “How things going, sweetheart?” he asked a woman who ambled out. Coleman listened sympathetically as she described how she was nearly raped at knifepoint months earlier, saying the area was “really tough” for a woman.

Soon, Coleman was back in his SUV on his way to fight the next pre-crime. Dozens of other LAPD officers were doing the same at other spots, guided by the crime prognostication system known as PredPol. “Predictive policing” represents a paradigm shift that is sweeping police departments across the country. US law enforcement agencies are increasingly trying to forecast where and when crime will occur, or who might be a perpetrator or a victim, using software that relies on algorithms, the same kind of maths Amazon uses to recommend books.

“The hope is the holy grail of law enforcement – preventing crime before it happens,” said Andrew Ferguson, a University of the District of Columbia law professor preparing a book on big data and policing.

Now used by 20 of America's 50 largest police forces by one count, the technologies are at the centre of an increasingly heated debate about their effectiveness, potential impact on poor and minority communities, and the implications for civil liberties. Some police departments have hailed PredPol and other systems as instrumental in reducing crime, focusing scarce resources on trouble spots and individuals, and replacing officers' hunches and potential bias with hard data.

PredPol highlights possible sources of crime on a map for patrols. Red boxes on the maps highlight crimes likely to occur (Patrick T. Fallon)

But privacy and racial justice groups say there is little evidence the technologies work and note the formulas powering the systems are largely a secret. They are concerned the practice could unfairly concentrate enforcement in communities of colour by relying on racially skewed policing data. Furthermore, they worry that officers who expect a theft or burglary is about to happen may be more likely to treat the people they encounter as potential criminals.

The experiments are one of the most consequential tests of algorithms that are beginning to figure increasingly in our lives, determining credit scores, measuring job performance and flagging children who might be victims of abuse. The White House has been studying how to balance the benefits and risks they pose.

“The technical capabilities of big data have reached a level of sophistication and pervasiveness that demands consideration of how best to balance the opportunities afforded by big data against the social and ethical questions these technologies raise,” the White House wrote in a recent report.

It was 6:45am on a Monday, but the sheet of paper Sgt Coleman held in his hands offered a glimpse of how the day might pan out: a car theft near Van Nuys and Glenoaks, a burglary at Laurel Canyon and Roscoe, and so on. The crime forecast is produced by PredPol at the beginning of each shift. Red boxes spread across Google maps of the San Fernando Valley, highlighting 500-foot-by-500-foot locations where PredPol concluded property crimes were likely.
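
A rough sketch of what producing such a shift forecast could look like, in Python: score every 500-foot grid cell from recent reported crimes, weighting newer reports more heavily, and flag the highest-scoring cells as the shift's boxes. PredPol's actual scoring model is proprietary; the scoring rule, half-life and example figures below are illustrative assumptions, with only the cell size taken from the description above.

    # A minimal sketch (illustrative, not PredPol's code): score each 500-foot
    # grid cell from recency-weighted reported crimes, keep the top cells.
    from collections import defaultdict

    CELL = 500.0  # grid cell size in feet, matching the boxes described above

    def cell_of(x, y):
        """Snap a coordinate (in feet) to its 500-foot-by-500-foot grid cell."""
        return (int(x // CELL), int(y // CELL))

    def daily_boxes(crimes, now, top_n=20, half_life_days=7.0):
        """Rank cells by recency-weighted counts of reported crimes; return the top_n."""
        scores = defaultdict(float)
        for x, y, day in crimes:               # (x_feet, y_feet, day) of each reported crime
            age = now - day
            if age < 0:
                continue                       # ignore anything dated in the future
            # Older reports count less; recent ones dominate the day's map.
            scores[cell_of(x, y)] += 0.5 ** (age / half_life_days)
        ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
        return ranked[:top_n]

    # Example: three recent reports near one corner outrank an old, distant one.
    reports = [(120, 430, 29), (300, 80, 28), (450, 410, 30), (5200, 9800, 2)]
    print(daily_boxes(reports, now=30, top_n=2))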

The forecast is cutting edge, but it is used in the service of an old-fashioned policing philosophy: deterrence. Between calls that day, Coleman and other officers were expected to spend time and engage with people in the roughly 20 boxes PredPol identified around the Foothill Division. Coleman sat behind the wheel of his SUV, plotting which boxes to hit the way someone consulting a weather map might weigh whether to bring an umbrella.

“It's not always that we are going to catch someone in the box, but by being there we prevent crime,” said Captain Elaine Morales, who oversees the Foothill Division. Foothill is far from the glitz of Hollywood on the northern edge of LA, but it has been at the centre of the transformation going on in US policing.

The division was one of the first in the US to adopt predictive policing five years ago and has helped refine PredPol. The technology has spread to other LAPD divisions and more than 60 other departments across the country, making it the country's most popular predictive policing system.

PredPol often draws comparisons to the movie Minority Report, in which a government unit rounds up people for crimes they have not yet committed, but one of the software's developers said it is not a crystal ball. Jeffrey Brantingham, a professor at UCLA, said crime often seems random, but it follows patterns.

“The question becomes, 'Can we build mathematical structures to understand these patterns?' The answer is yes, absolutely,” Brantingham said. “The best way to capture the way we think about crime patterns... is to think about earthquakes.” That's not just an analogy. Brantingham said a breakthrough moment in PredPol's development came when one of his partners realised an algorithm that described seismic activity could be used to predict crime.

Just as earthquakes happen along fault lines, Brantingham explained, research has shown crime is often generated by structures in the environment, like a high school, mall parking lot or bar. Additional crimes tend to follow the initial event near in time and space, like an aftershock.
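
In miniature, that earthquake-style idea is a "self-exciting" risk estimate: each spot carries a baseline risk tied to its environment, and every recent reported crime adds an aftershock-like bump that fades with time and distance. The Python sketch below is a hedged illustration of that general technique, not PredPol's code, and every parameter value is an assumed placeholder.

    # A minimal sketch of a self-exciting ("aftershock") risk estimate.
    # All parameter values are illustrative assumptions.
    import math

    def predicted_intensity(cell, now, past_crimes,
                            baseline=0.05,     # background risk of the spot (assumed)
                            boost=0.3,         # bump added by each past crime (assumed)
                            time_decay=0.5,    # how fast the bump fades per day (assumed)
                            dist_decay=0.01):  # how fast it fades per foot of distance (assumed)
        """Estimate the relative crime risk at one location at time `now` (in days)."""
        x, y = cell
        risk = baseline
        for cx, cy, day in past_crimes:        # (x_feet, y_feet, day) of reported crimes
            age = now - day
            if age < 0:
                continue
            distance = math.hypot(x - cx, y - cy)
            # Each past crime raises nearby, recent risk: the aftershock.
            risk += boost * math.exp(-time_decay * age) * math.exp(-dist_decay * distance)
        return risk

    # Example: a burglary yesterday, one block away, still elevates today's risk.
    crimes = [(500.0, 0.0, 9.0)]               # one report, 500 ft east, on day 9
    print(predicted_intensity((0.0, 0.0), now=10.0, past_crimes=crimes))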

Jeffrey Brantingham says crime often seems random but it follows patterns (Patrick T. Fallon)

PredPol uses years of crime data to establish these patterns, and then the algorithm uses near real-time crime data to predict the next property crime. Other systems use even more esoteric data – from the weather to phases of the moon – to arrive at their crime forecasts. But does it work?

Coleman fired up his SUV and headed out to a PredPol box as streaks of light poured over the dry hills surrounding LA. Parents walked their kids to school, and others rushed to work in the blue-collar, largely Latino community.

Coleman's SUV was one of only eight police cruisers circulating that morning in the Foothill, a 46-square-mile area that has a population of more than 180,000. It's easy to see why any system that could accurately pinpoint crime would be a major boon to the LAPD. For decades, police departments have mapped crimes using pushpins on paper maps and more recently blotchy hot-spot maps on computers. The maps always lagged behind crime on the street and could offer only general areas to focus patrols.

Coleman, a strapping and gregarious 26-year veteran of the LAPD, said he and other officers were initially sceptical that PredPol could anticipate a crime better than a seasoned officer – and do it in a box the size of a city block. But he quickly became a believer.

“If you spend three hours in that box the week after you had 10 crimes, the next week you are going to see three,” Coleman said as LA's low-slung houses, palm trees and strip malls slid by the car's windows.

LAPD Commander Sean Malinowski, who pioneered PredPol's use in the department, was also convinced. He relayed a story of how two of his officers found a thief in a stolen car in an area where PredPol predicted an auto theft. The person escaped, but the officers found him again – in another stolen vehicle in another box where PredPol forecast a theft.

But the data on the effectiveness of PredPol – and other predictive systems – presents a murkier picture. PredPol and the LAPD credit the system with helping bring about substantial reductions in property crime in the Foothill from 2012 through 2014, but crime has crept back up in the past couple of years, as it has in the rest of Los Angeles.

A study by Brantingham and other researchers found the system was roughly twice as good at predicting where crime would occur as the LAPD's crime analysts and reduced crime by 7 per cent, but no independent researchers have verified those claims or evaluated PredPol. The only independent study of a place-based predictive-policing system found the software had no statistically significant impact on property crime in Shreveport, Louisiana – and that system was one created by the researchers, not PredPol.

PredPol is just one example of predictive policing.

Police in Kansas City, Missouri, and Chicago maintain lists of hundreds of people who an algorithm predicted were likely to be involved in gun violence, either as perpetrators or victims. The calculations are based on arrests, gang affiliations and other variables. Police warn those on the list they are being watched, while social-service agencies offer help.
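
As a hedged illustration of how such a person-based list could be assembled, the Python sketch below scores each individual with a simple weighted sum over variables like those named above and keeps the highest scorers. The field names, weights and list size are illustrative assumptions, not the actual Chicago or Kansas City models.

    # A toy person-based risk list: weighted score per individual, keep the top.
    # Every weight, field name and threshold here is an assumption.
    def risk_score(person):
        """Toy weighted score over arrest, gang and victimisation variables."""
        score = 0.0
        score += 2.0 * person.get("gun_arrests", 0)          # prior gun-related arrests
        score += 1.0 * person.get("other_arrests", 0)         # other prior arrests
        score += 3.0 * (1 if person.get("gang_affiliated") else 0)
        score += 2.5 * person.get("times_shot", 0)            # prior shooting victimisation
        return score

    def build_list(people, size=400):
        """Return the `size` highest-scoring individuals, riskiest first."""
        return sorted(people, key=risk_score, reverse=True)[:size]

    # Example with made-up records:
    people = [
        {"name": "A", "gun_arrests": 1, "gang_affiliated": True},
        {"name": "B", "other_arrests": 4},
        {"name": "C", "times_shot": 1, "other_arrests": 1},
    ]
    for person in build_list(people, size=2):
        print(person["name"], risk_score(person))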

Sgt Coleman speaks with local people at a bus stop on his beat (Patrick T. Fallon)

Chicago police said earlier this year that the department's system was effective – more than 70 per cent of the people who were shot and 80 per cent of those arrested for shootings in 2016 were on the list. But a Rand Corp study released in the summer found that individuals on a 2013 version of the list were no more or less likely to be the victim of a shooting than a comparison group. Police dispute the study's findings.

Inconclusive benefits are just one criticism in the heated debate over the systems as they become more widespread. Predictive policing has become a flash point in the discussion over race and policing that has inflamed the US in recent years. Malinowski and police officials elsewhere see PredPol and similar systems as a way to combat bias among officers by using data to guide patrols.

“Through the use of data, it's less subjective,” Malinowski said. “It's objective.” But the ACLU and 16 other groups issued a statement in August outlining a variety of concerns about predictive policing, saying such systems give a technological sheen to old patterns of policing.

David Robinson, a founder of the Upturn think-tank, wrote in a report that accompanies the statement that predictive policing could increase police presence in poor and minority communities by creating a “ratchet effect”. “The basic problem is those forecasts are only as good as the data they are based on,” Robinson said. “People in heavily policed communities have a tendency to get in trouble. These systems are apt to continue those patterns by relying on that biased data.”
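
The "ratchet effect" Robinson describes is a feedback loop: patrols follow the recorded numbers, the extra presence produces more recorded incidents, and the same neighbourhoods keep getting flagged. The toy Python simulation below, using assumed figures rather than any department's data, shows how two areas with identical true crime rates can stay permanently apart on paper.

    # Two neighbourhoods with the same true offence rate; area A merely starts
    # with more recorded incidents. Patrols follow the records, and the extra
    # presence inflates A's records further, so the gap never closes.
    TRUE_RATE = 10              # actual offences per week in each area (assumed equal)
    DETECTION_BASE = 0.3        # share recorded under normal patrol (assumed)
    DETECTION_PATROLLED = 0.6   # share recorded where the extra patrol goes (assumed)

    recorded = {"A": 40, "B": 20}   # historical records: A looks "hotter" on paper

    for week in range(10):
        patrolled = max(recorded, key=recorded.get)   # send the extra patrol to the "worst" area
        for area in recorded:
            rate = DETECTION_PATROLLED if area == patrolled else DETECTION_BASE
            recorded[area] += TRUE_RATE * rate

    print(recorded)   # A stays far ahead even though true crime is identical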

Brantingham said it is a valid concern, but that PredPol uses only data from crimes that have been reported to police and verified. He said drug arrests and other offences that rely on the discretion of officers are not used because they are often more heavily enforced in poor and minority communities. Robinson also pointed out that the public – and even police who use the software – often do not know exactly how the systems are flagging particular locations or individuals. He said that makes accountability impossible.

Ferguson, the UDC law professor, said predictive policing raises a host of fundamental concerns and questions. He questioned how police will ensure the accuracy of the vast reams of data the systems rely on. An error could unfairly cast suspicion on a location or individual. Ferguson also wonders how predictive systems will affect officers. He anticipates forecasts will be used as a factor in officers' decisions to reach the “reasonable suspicion” threshold to stop people on the street and could affect the way officers approach stops.

“When you are told to be on the lookout for a particular crime in a particular place, that has to affect what you are going to do,” Ferguson said.

Sgt Coleman arrived at his next PredPol spot around 8:20am at a busy intersection. He chatted up three homeless people sitting on a bench at a bus stop, asking a question for which PredPol had already given him an answer: “How much crime occurs around here?”

© Washington Post
