theguardian.com — 14 June 2018, 13:07
Police face legal action over use of facial recognition cameras

Two legal challenges have been launched against police forces in south Wales and London over their use of automated facial recognition (AFR) technology on the grounds the surveillance is unregulated and violates privacy.

The claims are backed by the human rights organisations Liberty and Big Brother Watch following complaints about biometric checks at the Notting Hill carnival, on Remembrance Sunday, at demonstrations and in high streets.

Liberty is supporting Ed Bridges, a Cardiff resident, who has written to the chief constable of South Wales police alleging he was tracked at a peaceful anti-arms protest and while out shopping.

Big Brother Watch is working with the Green party peer Jenny Jones who has written to the home secretary, Sajid Javid, and the Metropolitan police commissioner, Cressida Dick, urging them to halt deployment of the “dangerously authoritarian” technology.

If the forces do not stop using AFR systems then legal action will follow in the high court, the letters said. Money for the challenges is being raised through a crowdfunding site.

According to Liberty, South Wales police have used facial recognition technology in public spaces at least 20 times since May 2017. On one occasion – at the 2017 Champions League final in Cardiff – the technology was later found to have wrongly identified more than 2,200 people as possible criminals.

At the time a force spokesman said: “Technical issues are common to all face recognition systems, which means false positives will be an issue as the technology develops ... The accuracy of the system used by South Wales police has continued to improve.”

Bridges claimed he was monitored on a shopping street in Cardiff last December, and again while protesting outside the Cardiff Arms Fair in March. He said: “Indiscriminately scanning everyone going about their daily business makes our privacy rights meaningless. The inevitable conclusion is that people will change their behaviour or feel scared to protest or express themselves freely – in short, we’ll be less free.

“The police have used this intrusive technology throughout Cardiff with no warning, no explanation of how it works and no opportunity for us to consent. They’ve used it on protesters and on shoppers.”

Corey Stoughton, Liberty’s advocacy director, said: “The police’s creeping rollout of facial recognition into our streets and public spaces is a poisonous cocktail – it shows a disregard for democratic scrutiny, an indifference to discrimination and a rejection of the public’s fundamental rights to privacy and free expression.”

Liberty has argued that AFR systems capture people’s biometric data without their consent, disproportionately misidentify female and non-white faces, and breach data protection laws.

Lady Jones expressed fears that she could end up on a facial recognition watch list when conducting her parliamentary and political duties. Details about her were held by the Met’s domestic extremism intelligence unit.

Big Brother Watch’s director, Silkie Carlo, said: “Facial recognition cameras are dangerously authoritarian, hopelessly inaccurate and risk turning members of the public into walking ID cards. The prospect of facial recognition turning those CCTV cameras into identity checkpoints, as in China, is utterly terrifying.”

Jones said: “I’m extremely concerned about the impact that the Met police’s use of automated facial recognition will have on my ability to carry out my democratic functions. Police use of this technology has no legal basis, and infringes people’s rights and civil liberties. That’s why I’m challenging the Met to end its use, now.”

Anna Dews, a solicitor at the law firm Leigh Day, who is representing Big Brother Watch and Jones, said: “The lack of a statutory regime or code of practice regulating this technology, the uncertainty as to when and where automated facial recognition can be used, the absence of public information and rights of review, and the use of custody images unlawfully held, all indicate that the use of automated facial recognition, and the retention of data as a result, is unlawful and must be stopped as a matter of priority.”

The Home Office has been approached for comment.