
Predictive policing: How algorithms help fight crime

Preventing crimes before they happen - a desire that resonates with police and public alike. Two experts explain whether and how forecasting systems for predictive policing can help.

Police officer John Anderton has a problem: he works for the Precrime division of the Washington police in the United States, routinely arresting people who are about to commit murder. Now the system predicts that he himself will soon commit a murder. Is the supposedly error-free prognosis wrong? Or is even a policeman not immune to committing a treacherous murder?
 
The plot of the 2002 US feature film Minority Report serves as a much-cited example of the idea of “pre-crime” - preventing crime before an offense is committed, so to speak - and of predictive policing, which a study by the University of Hamburg translates as “anticipatory police work”. Central to this are algorithms that take over the operational prediction of crimes.
 
 
The image of predictive policing in the media is usually rather gloomy: the main fears are false prognoses and spying, even an all-controlling police and surveillance state. Yet predictive policing has been in use in German police authorities since 2016 at the latest - and it has little to do with Minority Report. “One night I was standing in front of a map into which the police officers had stuck pins,” says Thomas Schweer. The sociologist is the managing director of the Institute for Pattern-Based Forecasting Technology (IfmPt) in Oberhausen. “At the time I said: We are living in the year 2000. In 1969 we flew to the moon, and in 2000 we are still sticking pins into a map?” The IfmPt developed - “by officers, for officers”, according to Schweer - the pioneering police forecasting software PRECOBS, which is used in Germany and Switzerland.
 

How does PRECOBS work?

In greatly simplified terms, the PRECOBS software works roughly like the purchase suggestions of online retailers: “Customers who bought XY also bought….” Based on data, the software searches for patterns that can be used to predict future behavior. The more data the system has, the more precise its suggestions become.
 
So if there is a break-in in a PRECOBS area, police officers enter selected data into the system - for example: How was the break-in carried out? At what time? What was stolen? In which area is the crime scene located? It is then the task of PRECOBS to forecast the area in which the next break-in is statistically most likely. The scientific basis is the so-called near-repeat approach: if a break-in occurs in a certain area, another break-in in the same area is very likely in the near future. When the system produces a forecast, officers drive into the area and look around: Are there any suspicious vehicles? Are suspicious people sneaking about? Does anything on site seem out of place?
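The near-repeat logic described above can be sketched in a few lines of Python. This is an illustrative toy, not PRECOBS itself: the planar coordinates, the 400-metre radius and the seven-day window are assumed values chosen for the example, not parameters of the actual software.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Burglary:
    """A recorded break-in: position on a simple planar grid (metres) plus time."""
    x: float
    y: float
    time: datetime

def near_repeat_alert(incident: Burglary, query_x: float, query_y: float,
                      query_time: datetime,
                      radius: float = 400.0,
                      window: timedelta = timedelta(days=7)) -> bool:
    """Return True if the queried place and time fall inside the spatial
    radius and the follow-up time window of a prior incident - the core
    of the near-repeat idea: risk is elevated nearby, for a while."""
    dist = ((incident.x - query_x) ** 2 + (incident.y - query_y) ** 2) ** 0.5
    elapsed = query_time - incident.time
    return dist <= radius and timedelta(0) <= elapsed <= window

b = Burglary(x=0.0, y=0.0, time=datetime(2021, 1, 1, 22, 0))
# A spot 300 m away, two days later: inside the alert zone.
print(near_repeat_alert(b, 300.0, 0.0, datetime(2021, 1, 3, 22, 0)))   # True
# The same spot three weeks later: the window has elapsed.
print(near_repeat_alert(b, 300.0, 0.0, datetime(2021, 1, 24, 22, 0)))  # False
```

A real system would score many incidents over a city grid and rank cells by risk; the sketch only shows why a single fresh break-in raises the predicted risk of its immediate surroundings for a limited time.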
 
The forecasting performance of PRECOBS and comparable systems is currently (still) limited to break-ins, and the actual benefit of the software is controversial. By the end of 2020, however, several federal states had police forecasting software, either purchased from external service providers or developed within the authority.

PRECOBS specializes in mass crime such as residential burglaries (see info box) and is intended to predict where in an area the next break-in is likely - its success is controversial. An evaluation found “only a very limited use” of PRECOBS, says the criminologist Dominik Gerstner. He works at the Max Planck Institute for the Study of Crime, Security and Law in Freiburg im Breisgau, which carefully examined the IfmPt software between 2015 and 2016. The system is not superfluous - but its effects on police work “are only in a moderate range”, according to Gerstner. Developer Schweer says of the evaluation: “If you read the study, you will find many references to the functionality of PRECOBS. Only the conclusion is poorly done.” For him, the result is by no means a setback: “Just wait and see.”

Predictive policing on the rise worldwide

And Schweer may well be right. Police forecasting software is already in use in Great Britain, South Africa, Brazil, Switzerland and the Netherlands. China is also experimenting with “Sesame Credit” - a social scoring system reminiscent of the German SCHUFA, but incorporating even more ranking factors. In Germany, US predictive policing is viewed especially critically. With Chicago’s “Strategic Subject List” (also known as the “Heat List”), for example, it is no longer possible to clearly differentiate between suspects and potential victims of criminal offenses. It is also unclear to those affected how they end up on the list - and whether they will ever be deleted from it.
 
 
In the Californian city of Fresno, on the other hand, officers responding to an emergency call can first learn from the “Beware” forecasting system whether the person at the scene has a criminal record or a firearm. “That mainly influences how the police officers act,” says Gerstner. “Whether they are more likely to ring the doorbell, or stand in front of the door with guns drawn. In the USA that can of course make a big difference.” Also problematic are the use of personal data in predictive policing - for example place of residence, bank account or marital status - as well as racial and ethnic stereotyping and police bias against poorer people. Personal data is collected in Germany too - but currently only in relation to so-called Islamist “threats”. Gerstner and Schweer both emphasize that software can also deliver incorrect results, for example due to missing or faulty data or simple input errors. Such errors can then reproduce within a system, with serious consequences for those affected.

Digitization boost instead of dystopia

For Germany, however, the following applies for now: predictive policing is more of a digitization boost than dystopian science fiction. According to Gerstner, the software has made officers’ work easier, even if individual federal states use different solutions. Police officers can now collect and evaluate large amounts of structured data on certain criminal offenses and recognize conspicuous patterns in good time. This lets them get “ahead of the situation”, as Schweer puts it in police jargon - that is, foresee what demands the response units will have to cope with.
 
The introduction also made it clear that the police need modern computers, smartphones and software for their work - tools they can understand and operate correctly. Modern technical equipment is not a matter of course in Germany, says Schweer, and neither is the interaction between people and data: “You also have to take reservations seriously,” says the sociologist. “Now there is software that tells me where to go. It must not be a black box for the officer, and it must be technically comprehensible.” The experts agree that predictive policing is not a “magic crystal ball” from which “crime trends” or even future criminal offenses can be read. Many investigations - including cybercrime - can be carried out well using traditional methods. Rather, predictive policing instruments are decision-making and investigative aids for officers that can save staff and costs.

A sleeping giant - police databases

But German predictive policing is not harmless either. The German security authorities have enormous amounts of data at their disposal, even if it slumbers on quasi-isolated “data islands”. The sparse links between them may appear outdated, but they also protect the stored data against access. Instruments such as PRECOBS, however, increase the pressure on police work to handle precisely this kind of data and to treat it as raw material. It becomes especially explosive for data protection and risk prognoses when additional data is purchased externally - for example from geomarketing, as Gerstner notes. So far it is unclear exactly how such data would affect predictive policing software and whether - as in the USA - it could lead to stronger bias against certain social groups.

The criminologist Dominik Gerstner | Photo: © Max Planck Institute for the Study of Crime, Security and Law

Public debates about predictive policing remain weak

Cooperation between private-sector providers and public authorities can also create friction. The state of Hesse, for example, uses forecasting software based on the Gotham product from the US company Palantir. Palantir also works with US agencies such as the CIA, NSA and FBI. “Gotham is complex; very different data sources are fed in,” says criminologist Gerstner. “There are a lot of algorithms in there that are a trade secret. That is of course a problem for the authorities.” Buying such software with taxpayers’ money has to pay off for the authorities, which creates pressure to demonstrate success in its application. But what if data protection has to be weakened further so that the “correct” results can be delivered quickly?
 
Last but not least, predictive policing instruments already exist in some federal states - yet public discussion about them remains rather weak. Besides data protection problems, critics fear above all that legal pillars such as the presumption of innocence and the prohibition of discrimination could be eroded by predictive policing instruments. Questions about the potential of the prognoses also remain unanswered - for example, how can crimes that were never committed actually be measured? Moreover, among police officers the arrest rate is still internally regarded as a measure of success - how will officers deal with these expectations in the future? And what role does the software play in the arrest rate? Just as fundamental is the question of why instruments such as predictive policing are needed at all - because the crime rate in Germany is falling rather than rising.

So do we merely feel more insecure, or are there really more threats? And: in which social class do we feel more threatened - and in which more criminalized? One of the most contentious points is the development of software intended to track down radicalized attackers. But what if a potential terrorist attack could only be prevented by the remote-controlled killing of such a target? And what if the prognosis was wrong in even one case?

Jan. 14, 2021

Sylvia Lundschien

Observing and writing down are basic biographical constants for Sylvia Lundschien - but it took a comparatively long time to turn this into a profession. Before her training at the Protestant School of Journalism in Berlin, she studied European ethnology, Russian and intercultural communication in Berlin, Moscow and Frankfurt (Oder). Today she works as a freelance journalist in Berlin and is interested in a wide range of subjects from science, feminism, international politics and online curiosities.

Copyright: This text is licensed under Creative Commons Attribution - Noncommercial - No Adaptations 4.0 International (CC BY-NC-ND 4.0).