Predictive Policing


Robot judges in Estonia? American AI sentencing criminals to prison? Computers predicting crimes before they even happen? One may be forgiven for thinking such concepts come straight from a science fiction novel, or the rabid rantings of an online conspiracy theorist. The truth is they are all part of today's reality. And it looks like it's only a matter of time before predictive policing and artificial intelligence become everyday features of justice systems worldwide.

In 2016, computer scientists at University College London developed an AI judge, said to be capable of weighing up evidence, arguments and weighty dilemmas of right and wrong, and of predicting the outcomes of real cases with 79% accuracy compared with cases decided the old-fashioned way. At the time, such AI judges were strictly the stuff of academic research. But in 2018, the Estonian government announced it was developing a pilot program in which artificially intelligent judges would issue actual court rulings. Under the program, small debt disputes are adjudicated by AI, subject to appeal to a real-life human judge.

Nowadays, in many American criminal courts, it has become almost routine for AI systems to provide sentencing recommendations and other guidance that judges use in reaching their decisions. These "risk assessment tools" analyse a defendant's profile (including race, age, gender and where they live) to produce a recidivism score: a numerical estimate of how likely that person is to re-offend. A judge may use the score to help decide whether a defendant should be granted bail before trial and, if convicted, how long their sentence should be.
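To make the mechanics concrete, here is a deliberately simplified sketch of how such a risk assessment tool works: a defendant's profile goes in, a numerical score comes out. Every feature, weight and cut-off below is invented for illustration; real tools such as COMPAS are proprietary and far more elaborate, but the basic shape is the same.

```python
# Hypothetical "risk assessment tool" -- all weights and thresholds here
# are invented for illustration, not taken from any real system.

def recidivism_score(age: int, prior_offences: int, employed: bool) -> int:
    """Return a 1-10 'likelihood of re-offending' score from a profile."""
    score = 5.0
    score += 1.5 if age < 25 else -0.5      # youth treated as higher risk
    score += 0.8 * min(prior_offences, 5)   # criminal history dominates
    score -= 1.0 if employed else 0.0       # stable employment lowers the score
    return max(1, min(10, round(score)))

# The judge is then shown a banded recommendation rather than a raw number:
score = recidivism_score(age=22, prior_offences=3, employed=False)
band = "high" if score >= 7 else "medium" if score >= 4 else "low"
print(f"Recidivism score: {score}/10 ({band} risk)")  # 9/10 (high risk)
```

Notice that even this toy version never asks what the defendant actually did in the case before the court; it scores who they are and where they come from.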

And computerised "justice" doesn't stop there.

In recent times, countries including Australia, Britain and the USA have adopted "predictive policing" into standard policing procedure: vast amounts of data are fed into specialised programs whose algorithms calculate where, and at what times, police departments should allocate their resources. Some such systems use facial recognition software to help identify "potential suspects" based on gender, age, race, history and economic circumstances. So if a white male aged between 18 and 25, from poor socio-economic circumstances and historically known for criminal behaviour, is in a particular area, he will be flagged as a suspect. Japan is reportedly looking at this type of predictive system as it gears up for the 2020 Tokyo Olympics, in the hope of targeting criminal suspects before they actually do anything wrong.
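At its core, the resource-allocation side of these systems boils down to something like the following sketch. The grid and incident data are made up, and production systems use far more sophisticated statistical models, but the logic of sending patrols wherever past data points is essentially this:

```python
# Illustrative-only core of a "predictive policing" allocator: count past
# incidents per map grid cell, then patrol the highest-count cells.
from collections import Counter

# (grid_x, grid_y) cells where historical incidents were recorded (made up)
historical_incidents = [
    (2, 3), (2, 3), (2, 3), (5, 1), (5, 1), (0, 4), (2, 3), (5, 1), (7, 7),
]

def patrol_targets(incidents, patrols_available: int):
    """Rank grid cells by recorded incidents; patrol the top ones."""
    counts = Counter(incidents)
    return [cell for cell, _ in counts.most_common(patrols_available)]

print(patrol_targets(historical_incidents, patrols_available=2))
# [(2, 3), (5, 1)] -- yesterday's records decide where police go today
```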

Undoubtedly, such computer profiling, which processes data based on history, suspicion and presumption, is cutting-edge and very clever. But is it just? Or does it impermissibly discriminate against the marginalised and disadvantaged in our community?

Last year, NSW Police identified around 400 children as requiring "pro-active attention" under the Suspect Target Management Program (STMP), a form of predictive policing. But while just 5.6% of children in New South Wales are Aboriginal, 51.5% of the roughly 400 young people targeted by the STMP were Indigenous Australians, an over-representation of roughly nine times (51.5 ÷ 5.6 ≈ 9.2). Once on the list, those children were likely to be routinely stopped and questioned by police, detained, and even visited at home, sometimes on multiple occasions and not necessarily in response to any pressing reason. The results have raised concerns that such systems, which can only spit out results based on the data we feed into them, are actually entrenching prejudice, racism and discrimination in society, thereby further disadvantaging the already disadvantaged.
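The entrenchment concern is essentially a feedback loop, and a toy simulation shows how it works (all figures below are invented). Two areas with identical underlying offending start with different recorded histories because one was policed more heavily; if recorded incidents drive patrol allocation, the model never corrects that initial skew:

```python
# Toy feedback-loop simulation -- all figures are invented assumptions.
recorded = {"A": 30, "B": 10}  # historical records, NOT true crime rates
TRUE_OFFENDING = 20            # identical underlying offending in both areas
CATCH_RATE = 0.04              # incidents recorded per patrol unit deployed

for year in range(1, 6):
    total = sum(recorded.values())
    # allocate 10 patrol units in proportion to recorded incidents
    patrols = {area: 10 * recorded[area] / total for area in recorded}
    for area in recorded:
        # new records follow where police look, not where crime differs
        recorded[area] += TRUE_OFFENDING * CATCH_RATE * patrols[area]
    print(f"year {year}: area A patrol share = {patrols['A']/10:.0%}")
# prints 75% every year: the historical skew is never corrected
```

Even though both areas offend at exactly the same rate, area A attracts three-quarters of police attention indefinitely, purely because of its historical record.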

There is no doubt artificial intelligence is an invaluable tool to further human achievement in all fields, including justice. The danger is that we allow it to also advance, enhance and ultimately entrench our biases.

Natasha Dawson

Queensland Criminal Lawyer
