Real-time surveillance will test the British tolerance for cameras

[CARDIFF, Wales] A few hours before a recent Wales-Ireland rugby match, police officers popped out of a white van, stopped a man and then arrested him. A camera attached to the van had captured his image, and facial recognition technology identified him as someone wanted on suspicion of assault.

The presence of the cameras, and the local police's use of the software, are at the center of a debate in Britain that is testing the country's long-standing acceptance of surveillance.

A new generation of cameras that enable real-time identity checks is coming into use, raising fresh concerns among public officials, civil society groups and citizens. Some members of Parliament have called for a moratorium on the use of facial recognition software.

This month, in a case that has been closely watched because there is little legal precedent in the country on the use of facial recognition, a British High Court ruled against a man from Cardiff, the capital of Wales, who sued to end the use of facial recognition by the South Wales Police. The man, Ed Bridges, said the police had violated his privacy and human rights by scanning his face without consent.

Critics have said the technology is an intrusion on privacy and has questionable accuracy.

The South Wales Police said the technology was necessary to make up for budget cuts by the central government. "We are having to do more with less," said Alun Michael, the South Wales police and crime commissioner. He said the technology was "no different than a police officer standing on the corner looking out for individuals and if he recognizes somebody, saying, 'I want to talk to you.'"

Police said that since 2017, 58 people had been arrested after being identified by the technology.

Critics also have said there has been a lack of transparency about the technology's use, particularly about the creation of watchlists, which determine which faces a camera system is hunting for.

Sandra Wachter, an associate professor at Oxford University who focuses on technology ethics, said that even if the technology could be proven to identify wanted people accurately, laws were needed to specify when the technology could be used, how watchlists were created and shared, and the length of time images could be stored.

 

NYTIMES