
4 February 2019, 13:45
The Guardian view on crime and algorithms: big data makes bigger problems

Some of the best, or at least sharpest, minds on the planet are devoted to guessing what we might buy next, and showing us advertisements for it. Often the results are ludicrously inaccurate; sometimes they are creepily precise. Would we trust the same kind of technology to predict what crimes we might next commit? That is the question raised by the latest report from campaigners at Liberty on the implications of the police’s use of big data and machine learning, the technologies usually referred to as artificial intelligence. When they are used to sell us things, they are relatively harmless. When they sell opinions, they can corrupt democracy. When they determine the course of the criminal justice system, they could do immense damage. Because machine learning can only detect patterns in the data it is given, any bias in the original sample will be reproduced and amplified. So if past practice has been to discriminate against women or minorities, any algorithm fed on previous experience will continue this pattern, but this time with the apparent authority of science behind it. And because modern machine learning techniques are opaque, even to their programmers, a computer cannot easily be made to testify about its own reasoning in the way that police officers can – in theory – be questioned by judges or politicians.

The civil liberties group says it found that at least 14 police forces in England and Wales are using, or have used, software to predict crimes in particular areas; three are attempting to use the same technology on individuals to predict their likelihood of reoffending. The problems the police are trying to solve cannot simply be wished away. It makes obvious sense to concentrate police resources where they are most needed, especially after the swingeing cuts imposed by Tory governments. But it is easy to see how the use of such software can perpetuate and entrench patterns of unjust discrimination. Because crimes are detected more often where there are police to detect them, the areas in which police are concentrated will tend to have higher recorded crime rates, which in turn suggests that they need more police sent to them, and so on: a feedback loop that confuses where crime is recorded with where it actually occurs. Only constant human attention will keep the technology focused where it is useful.

Society does have a vital interest in being able to predict who is most likely to offend or to reoffend, and to help steer them away from temptation. But the idea that algorithms could substitute for probation officers or the traditional human intelligence of police officers is absurd and wrong. Of course such human judgments are fallible and sometimes biased. But training an algorithm on the results of previous mistakes merely means that those mistakes can be repeated without human intervention in the future. The strongest single predictor of whether a young man will end up in jail is whether his father did so. People live down to society’s expectations, and we all lose as a result.

The more data is collected and shared through the state’s various systems – not just the police, but schools, the welfare system and even, increasingly, the NHS – the harder it will be for anyone to escape such stereotyping. The risk is a loss of liberty with no corresponding gain in efficiency. The underlying problem is that successive governments have mounted an unprecedented assault on the entire criminal justice system, whether courts, legal aid, the probation service or the police. The damage this has done cannot be repaired by technology alone. Machines can make human misjudgments very much worse. They cannot be trusted to eliminate human responsibility.
