Police cease using algorithm that predicts future violent behaviour


Following critical reporting by Follow the Money on the use of the Violence Risk Assessment Tool (RTI-G), the police have stopped using this algorithm with immediate effect. Marc Schuilenburg, Professor of Digital Surveillance at Erasmus School of Law, gained access to the documents setting out the algorithm’s rationale and shared his concerns about the investigative tool with Follow the Money. Not only have the police never evaluated its effectiveness, but the risks associated with its use are also very high. In addition, the tool raises serious ethical and legal objections, according to Schuilenburg.

Since 2015, the Dutch police have used an algorithm to predict at the individual level whether someone will commit violence in the future. According to research by Follow the Money, until mid-2017 individuals with an Antillean, Moroccan, or Somali background consistently received higher risk scores than individuals of other origins. Although the police say they have since removed this bias from the algorithm, Schuilenburg argues that such an algorithm can only lead to problems.

Serious shortcomings

The professor is highly critical of RTI-G: “There are enormous risks involved.” To begin with, making reliable crime predictions at the geographical level is already very difficult; doing so at the individual level is even less likely to succeed, according to Schuilenburg: “This is much more complex. Countless factors come into play. Every individual is different.”

Labelling someone with a risk profile grants the police various intrusive investigative powers, such as preventive searches and searching that person’s vehicle. When the police intend to restrict a citizen’s freedom this severely, thorough justification is required. According to Schuilenburg, the reasoning in the algorithm’s accountability document falls short: “The input, the processing, and the output. It all falls short.” The document, he asserts, lacks proper information about the data used, the selection of risk factors, their weighting, how the model was validated, and how bias was checked. Moreover, it contains no interim evaluation: “The risk factors in this instrument were conceived ten years ago, but they have never been revisited. It simply cannot go on like this.”

Precautionary society

Schuilenburg investigates why the government uses such predictive algorithms and how they operate: “In politics and society, the focus is on preventing potential risks. In the past, there was suspicion first, then surveillance. Now, there is surveillance first, then suspicion.”

According to Schuilenburg, instruments such as this police algorithm can emerge because safety, effectiveness, and efficiency outweigh transparency, non-discrimination, and algorithmic accountability in our society. The result is something like RTI-G: “This is not something you should want. Not just ethically, but also legally. This risk model falls squarely within the definition of ‘high-risk’ in the European Union’s forthcoming Artificial Intelligence regulation. It clashes with the requirements set out there.”

Marc Schuilenburg is pleased that the police have ceased using the Violence Risk Assessment Tool. According to a police spokesperson responding to Follow the Money’s report, there are doubts about whether the tool is useful, and it is unclear whether officers were still using it at all.

More information

Click here for the RTI-G investigation by Follow the Money (in Dutch).

Click here for Follow the Money’s article about the police discontinuing RTI-G (in Dutch).

Related content
On 23 June 2023, Marc Schuilenburg officially assumed his position as Professor of Digital Surveillance with his inaugural lecture.
