A "Minority Report"-style AI capable of predicting murders? It is no longer fiction in the United Kingdom


The British Ministry of Justice has been quietly working on the development of an algorithm designed to identify people likely to become killers, at least until reporting by The Guardian shone a spotlight on the tool, called the "Homicide prediction project".

The tool draws on data from UK police forces, which potentially includes information on victims and witnesses as well as on suspects.


Reading this "pitch", many of you will no doubt have thought of the science fiction film Minority Report, in which crimes are predicted before they occur. This scenario, which seemed highly dystopian not so long ago, is now becoming an administrative and judicial reality in the United Kingdom.

Of course, the use of predictive algorithms in criminal justice is not new. But their application to the prediction of crimes as serious as homicide is unprecedented.

Statewatch's revelations

The civil liberties watchdog Statewatch discovered the existence of this program through requests made under the Freedom of Information Act. According to the documents obtained by the group, the program developed its prediction tool using police data concerning between 100,000 and 500,000 people.

Even more worrying, the categories of information shared with the Ministry of Justice also appear to cover sensitive subjects such as mental health, addiction, suicidal tendencies and disability. Unsurprisingly, the scale of the data collection and the sensitive nature of the information used have raised serious questions about privacy and the protection of personal data.


Taken together, these revelations suggest that, given the considerable scope of the project, the British government is investing substantial resources in this algorithmic approach to crime prevention.

Biases inherent in AI

Unsurprisingly, civil liberties and digital ethics experts are worried about this project. Sofia Lyall, a researcher at Statewatch, said: "Time and again, research shows that algorithmic systems for 'predicting' crime are inherently flawed. This latest model, which uses data from our institutionally racist police and Home Office, will only reinforce and amplify the structural discrimination underpinning the criminal legal system."

This criticism points to a recurring problem in predictive justice systems: algorithms are trained on historical data that often reflects existing biases and inequalities in the judicial system. Consequently, these systems are likely to perpetuate, or even amplify, those biases rather than correct them.

There is also the question of how algorithmic predictions could influence the decisions of judges, prosecutors and probation officers. If the algorithm identifies a person as being at high risk of committing homicide, could this lead to different treatment before any crime is committed? Minority Report really isn't so far off…

The British Ministry of Justice responds

In response to the controversy, a Ministry of Justice representative told the Guardian: "This project is being conducted for research purposes only. It has been designed using existing data held by His Majesty's Prison and Probation Service and the police on convicted offenders to help us better understand the risk of people on probation going on to commit serious violence. A report will be published in due course."


This statement is meant to be reassuring: the tool is currently used for research purposes only, and it focuses on people already convicted and on probation rather than on the general population. That is already good news. However, the ministry does not really address the legitimate concerns about potential biases or the future uses of this technology.

AI and law enforcement, never a good idea

One might think it logical, even healthy, for the police to constantly test solutions that could reduce the risk of homicide. But that would overlook the problematic cases documented all over the world, from the use of AI to write police reports (a bad idea, according to experts) to the excessive use of programs like ShotSpotter, not to mention the adoption of technologies that threaten citizens' privacy, as in China, where facial recognition reigns supreme.

History therefore does not argue in favor of a sound implementation of these technologies. Crime prediction systems have often been criticized for their lack of transparency, their tendency to reinforce existing prejudices and their inability to account for the social factors that contribute to crime. In the United Kingdom, moreover, this project does not come out of nowhere: it fits into a broader context in which the authorities' use of surveillance and prediction technologies is the subject of growing public debate, particularly with regard to facial recognition and other forms of mass surveillance.



