Facial recognition technology is not coming … it's already here. 2018 is fast becoming the year facial recognition finally hits the mainstream, with a constant stream of stories revealing the growing use of these systems by law enforcement agencies around the world. A resident of Cardiff in the UK is now challenging whether the technology violates privacy rights, warning that if the country's police forces do not stop using these systems he, and others, will launch legal action in the High Court.
Cardiff resident Ed Bridges, with the backing of Liberty, a UK-based human rights group, is directly targeting South Wales Police's use of automated facial recognition technology over the past few years. Bridges claims he was unreasonably tracked by this technology on several occasions, most recently in March at a peaceful protest outside the Cardiff Arms Fair.
The police reportedly deployed the technology outside the main entrance of the event, potentially scanning Bridges and numerous others without their knowledge. Several UK police forces have been using such systems since 2015, and the police say all images of passers-by are kept for only 31 days before being erased.
"The police have used this intrusive technology throughout Cardiff with no warning, no explanation of how it works and no opportunity for us to consent," says Bridges in his complaint. "They've used it on protesters and on shoppers. This sort of dystopian policing has no place in our city or any other."
The case raises four objections to law enforcement's use of facial recognition technology in public spaces: that it violates a person's right to privacy, interferes with freedom of expression, discriminates against minorities due to inaccuracies in the technology, and breaches data protection laws.
Another potential legal action in the UK against law enforcement's use of facial recognition systems is coming from Big Brother Watch, a civil liberties group working in association with Green party peer Jenny Jones. A solicitor representing Jones and Big Brother Watch sums up the concerns, saying, "The lack of a statutory regime or code of practice regulating this technology, the uncertainty as to when and where automated facial recognition can be used, the absence of public information and rights of review, and the use of custody images unlawfully held, all indicate that the use of automated facial recognition, and the retention of data as a result, is unlawful and must be stopped as a matter of priority."
Rising concern over facial recognition technology is not limited to the United Kingdom. Back in May, it was revealed that a chapter of the American Civil Liberties Union (ACLU) had obtained a series of emails revealing how Amazon was selling a facial recognition system to law enforcement agencies across the country.
Called Rekognition, the technology can reportedly track large numbers of faces in crowds in real time, identifying up to 100 different facial targets from a single image. The ACLU documents revealed how Amazon was actively working with several government agencies to deploy the technology.
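The ACLU emails do not detail the code those agencies ran, but Rekognition's publicly documented face-detection API gives a sense of how low the barrier is. The sketch below is a hypothetical illustration using the AWS Python SDK (boto3): the DetectFaces call returns details for up to 100 of the largest faces in a single image, roughly the capability described above. The image file name and region are placeholders, not anything from the source.

```python
# Minimal sketch (not the ACLU-documented deployment) of calling Amazon
# Rekognition's publicly documented DetectFaces API via boto3.
import boto3


def detect_faces(image_path: str, region: str = "us-east-1"):
    """Return face details for up to 100 of the largest faces in an image."""
    client = boto3.client("rekognition", region_name=region)
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["DEFAULT"],  # bounding box, landmarks, pose, confidence
    )
    return response["FaceDetails"]


if __name__ == "__main__":
    faces = detect_faces("crowd_photo.jpg")  # hypothetical file name
    print(f"Detected {len(faces)} faces")
    for face in faces:
        print(f"  confidence={face['Confidence']:.1f}%, box={face['BoundingBox']}")
```

Identifying who those faces belong to would require a further step, such as searching them against a pre-built collection of known images, which is where the civil liberties concerns described below become acute.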
Unsurprisingly, the public response to the ACLU documents was one of outrage. Civil libertarians raised questions about how the technology could be misused, while several Democratic representatives highlighted concerns over the inherent bias in the algorithms. All of these concerns came back to the same point: the technology was being rapidly deployed with no oversight or regulation.
While China may be racing ahead in incorporating facial recognition systems broadly across all sectors of its society, more democratic nations are rightly asking some important questions that are yet to be answered. Is the deployment of a mass facial recognition system in a public space an invasion of personal privacy?
After all, it could be argued that tracking a person's movements in public spaces is akin to the harvesting of metadata from a personal smart device, and in some jurisdictions it has already been established that gathering metadata without a warrant is lawful. Would personalized facial recognition data captured in a public space be treated as a form of social metadata? And, more fundamentally, what right do we have to this kind of privacy in a public space?
Setting aside the entirely valid questions over accuracy and racial bias in these facial recognition systems (a vitally important issue that needs to be resolved before broad deployment of the technology can be justified), the legal challenges now arising in both the UK and the US serve as a reminder that technology moves faster than government regulation. And while these systems will inevitably see wider deployment in the future, it is vital that, at the very least, a transparent conversation takes place about what kind of oversight is necessary and at what point personal privacy is being breached.
Source: Liberty Human Rights