abajournal.com · 2019-10-22 07:31
As machines play a greater role in criminal justice, third-party auditing and oversight is essential

Jason Tashea


In August, the Justice Ministry of Denmark announced it would review more than 10,000 criminal convictions because of a software error.

At issue was a technology that converts cellphone carriers’ geo-location data into evidence used for prosecution. During that automated conversion, accuracy was unknowingly lost, yet the degraded data was still used in criminal investigations and prosecutions. Compounding this problem, some of the location data linked cellphones to the wrong cell towers, putting innocent people at crime scenes.
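The mechanics of this kind of failure can be sketched in a few lines. The tower names, coordinates, and rounding step below are invented for illustration (the Danish conversion error has not been publicly specified at this level of detail); the point is only that a lossy export of location fixes can flip a nearest-tower match:

```python
# Hypothetical illustration: converting raw carrier records into an
# export format can silently discard precision, and a nearest-tower
# lookup on the degraded data can then pick the wrong tower.
# All names and coordinates are invented for this example.

def nearest_tower(point, towers):
    """Return the name of the tower closest to the (x, y) point."""
    return min(
        towers,
        key=lambda name: (towers[name][0] - point[0]) ** 2
                         + (towers[name][1] - point[1]) ** 2,
    )

towers = {"tower_A": (12.34, 56.78), "tower_B": (12.30, 56.80)}

raw_fix = (12.335, 56.79)  # carrier's original, precise fix
# Lossy conversion: coordinates truncated to one decimal place.
converted_fix = (round(raw_fix[0], 1), round(raw_fix[1], 1))

print(nearest_tower(raw_fix, towers))        # tower_A
print(nearest_tower(converted_fix, towers))  # tower_B: precision loss flips the match
```

A defendant's phone is placed near tower_B, not tower_A, purely because of how the data was exported, which is exactly the sort of error an adversarial hearing over a single case is unlikely to surface.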

While the conversion issue was fixed in March, authorities will now assess 10,700 convictions dating back to 2012, according to coverage from the New York Times. At the same time, Danish prosecutors have put a two-month moratorium on the use of any cell geo-location data in criminal cases.

“On the basis of the new serious information, the Attorney General has decided to pull the handbrake in order that telecommunications data temporarily may not be used in court as evidence that the defendant is guilty or as the basis for pre-trial detention,” said Danish Justice Minister Nick Haekkerup in a statement, as translated by Martin Clausen, ex-general counsel, legal tech thought leader and CEO of Syngrato in Copenhagen.

The minister added that the experience “shakes our confidence in the justice system.”

As scalable technology plays a bigger role in the investigation, arrest and prosecution of people, mass conviction reviews will be more common. This creates a hidden but substantial human and monetary cost to hardware and software adoption in the criminal justice system.

Knowing that this is increasingly the new normal, police, prosecutors and courts must assume errors will occur and regularly have their technology and data systems audited by third parties. Doing so will improve faith in the criminal justice system, support the adoption of trustworthy technology that assists public safety and open trials, and avoid the human and monetary cost of mass post-conviction review.

In part, audits are needed because the criminal justice system’s adversarial nature is insufficient to ferret out systemically bad evidence.

While the justice system is replete with examples, perhaps none is more egregious than the scandal that rocked a Massachusetts crime lab earlier this decade.

From 2003 to 2012, a lab analyst falsified thousands of results and tampered with evidence. While an investigation concluded that the analyst’s corner-cutting and falsifications worried co-workers at the time, those worries went unheeded by supervisors.

In most U.S. jurisdictions, defense counsel (often public defenders) do not have access to a county’s crime lab, and this case was no different. Compounding matters, this analyst’s outputs were used by seven counties. Because of this deficit, the state crime lab’s results were often taken on good faith, and most defendants pled out before the analyst’s illegal actions came to light, according to a 2017 NBC News report.

Even after the analyst’s crimes became known in 2012, prosecutors wanted to preserve the convictions and put the onus on individual defendants to re-open their cases. After a five-year legal battle that went to the state’s supreme court, the tainted convictions were tossed.

In total, 21,587 cases ending in conviction were overturned, while about 320 were left standing or were retried, according to ProPublica.

These problems are not limited to forensics; they can arise with routine technology that courts have trusted for decades.

Just last year, the New Jersey Supreme Court threw into question nearly 21,000 DUI convictions because an officer in charge of calibrating the machines used in five counties failed to do so properly. Now with the breathalyzer results inadmissible, a panel of four judges is reviewing the cases to determine the fate of the standing convictions.
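The New Jersey calibration failure follows the same pattern: a systematic bias, invisible in any single case, that taints every reading the device produces. A minimal sketch, with all numbers invented for illustration, shows how an uncorrected calibration factor can push a lawful reading over the legal limit:

```python
# Hypothetical illustration: a breath-testing device scales its sensor
# reading by a calibration factor. If the device is not recalibrated
# against a known reference, a drifted factor biases every result.
# The factors and readings below are invented for this example.

LEGAL_LIMIT = 0.08  # typical U.S. per se BAC limit

def reported_bac(true_bac, calibration_factor):
    """Reading the device reports for a given true blood-alcohol level."""
    return round(true_bac * calibration_factor, 3)

correct_factor = 1.00
drifted_factor = 1.20  # uncorrected drift overstates every reading by 20%

print(reported_bac(0.07, correct_factor) >= LEGAL_LIMIT)  # False: lawful reading
print(reported_bac(0.07, drifted_factor) >= LEGAL_LIMIT)  # True: miscalibration fails the test
```

Periodic verification against a reference standard, by someone other than the operating agency, is precisely the audit step whose absence put 21,000 convictions in doubt.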

While it may be easy to pass off both the Massachusetts and New Jersey examples as the fault of human error or criminal negligence, that isn’t the point. Tens of thousands of people were deprived of due process and liberty because the actions of humans went unchecked for years. Meanwhile, the adversarial process, which focuses on only a single case or defendant at a time, was insufficient to discover and rectify a systemic error.

In both instances, the single point of failure could have been caught if outside auditing of breathalyzers and drug analysis facilities had been required and followed, rather than relying on the state to monitor its own technology and processes.

Currently, larger prosecutors’ offices are creating conviction integrity units to investigate fact-based—as opposed to legal-based—wrongful convictions. These units are laudable and should be replicated; however, by their nature, they are only useful after a person has been convicted.

Although proactive and preventative solutions are clearly needed, little is changing as we move from the era of rogue actors to rogue software. While it may be worth considering the possibility of a malicious actor, like a programmer or hacker, affecting these tools, it’s unnecessary. Software has more than enough unintentional errors to keep the justice system busy.

Some errors will be harmless, but others will ripple through the criminal justice system as police, prosecutors and courts become more reliant on third-party software and hardware, much of which is obscured from defendants and the public.
