
London’s Met Police switches on live facial recognition, flying in face of human rights concerns

While EU lawmakers are mulling a temporary ban on the use of facial recognition to safeguard individuals’ rights, as part of a risk-focused plan to regulate AI, London’s Met Police has today forged ahead with deploying the privacy-hostile technology, flipping the switch on operational use of live facial recognition in the UK capital.
The deployment comes after a multi-year period of trials by the Met and police in South Wales.
The Met says its use of the controversial technology will be targeted to “specific locations… where intelligence suggests we are most likely to locate serious offenders”.
“Each deployment will have a bespoke ‘watch list’, made up of images of wanted individuals, predominantly those wanted for serious and violent offences,” it adds.
It also claims cameras will be “clearly signposted”, adding that officers “deployed to the operation will hand out leaflets about the activity”.
“At a deployment, cameras will be focused on a small, targeted area to scan passers-by,” it writes. “The technology, which is a standalone system, is not linked to any other imaging system, such as CCTV, body worn video or ANPR.”
The biometric system is being provided to the Met by Japanese IT and electronics giant, NEC.
In a press statement, assistant commissioner Nick Ephgrave claimed the force is taking a balanced approach to using the controversial tech.
“We all want to live and work in a city which is safe: the public rightly expect us to use widely available technology to stop criminals. Equally I have to be sure that we have the right safeguards and transparency in place to ensure that we protect people’s privacy and human rights. I believe our careful and considered deployment of live facial recognition strikes that balance,” he said.
London has seen a rise in violent crime in recent years, with murder rates hitting a ten-year peak last year.
The surge in violent crime has been linked to cuts to policing services — although the new Conservative government has pledged to reverse cuts enacted by earlier Tory administrations.
The Met says its hope is that the AI-powered tech will help it tackle serious crime, including serious violence, gun and knife crime, and child sexual exploitation, and “help protect the vulnerable”.
However, its phrasing is more than a little ironic, given that facial recognition systems can be prone to racial bias, owing to factors such as bias in the data-sets used to train AI algorithms.
So in fact there’s a risk that police-use of facial recognition could further harm vulnerable groups who already face a disproportionate risk of inequality and discrimination.
Yet the Met’s PR doesn’t mention the risk of the AI tech automating bias.
Instead it takes pains to couch the technology as an “additional tool” to assist its officers.
“This is not a case of technology taking over from traditional policing; this is a system which simply gives police officers a ‘prompt’, suggesting “that person over there may be the person you’re looking for”, it is always the decision of an officer whether or not to engage with someone,” it adds.
While the use of a new tech tool may start with small deployments, as is being touted here, the history of software development underlines how the potential to scale is readily baked in.
A ‘targeted’ small-scale launch also prepares the ground for London’s police force to push for wider public acceptance of a highly controversial and rights-hostile technology via a gradual building out process. Aka surveillance creep.
On the flip side, the text of the draft of an EU proposal for regulating AI which leaked last week — floating the idea of a temporary ban on facial recognition in public places — noted that a ban would “safeguard the rights of individuals”. Although it’s not yet clear whether the Commission will favor such a blanket measure, even temporarily.
UK rights groups have reacted with alarm to the Met’s decision to ignore concerns about facial recognition.
Liberty accused the force of ignoring the conclusion of a report it commissioned during an earlier trial of the tech — which it says concluded the Met had failed to consider human rights impacts.
It also suggested such use would not meet key legal requirements.
“Human rights law requires that any interference with individuals’ rights be in accordance with the law, pursue a legitimate aim, and be ‘necessary in a democratic society’,” the report notes, suggesting the Met’s earlier trials of facial recognition tech “would be held unlawful if challenged before the courts”.

When the Met trialled #FacialRecognition tech, it commissioned an independent review of its use.
Its conclusions:
- The Met failed to consider the human rights impact of the tech
- Its use was unlikely to pass the key legal test of being “necessary in a democratic society”
— Liberty (@libertyhq) January 24, 2020

A petition set up by Liberty to demand a stop to facial recognition in public places has passed 21,000 signatures.
Discussing the legal framework around facial recognition and law enforcement last week, Dr Michael Veale, a lecturer in digital rights and regulation at UCL, told us that in his view the EU’s data protection framework, GDPR, forbids facial recognition by private companies “in a surveillance context without member states actively legislating an exemption into the law using their powers to derogate”.
A UK man who challenged a Welsh police force’s trial of facial recognition has a pending appeal after losing the first round of a human rights challenge. Although in that case the challenge pertains to police use of the tech — rather than, as in the Met’s case, a private company (NEC) providing the service to the police.

UK High Court rejects human rights challenge to bulk snooping powers

Civil liberties campaign group Liberty has lost its latest challenge to controversial UK surveillance powers that allow state agencies to intercept and retain data in bulk.
The challenge fixed on the presence of so-called ‘bulk’ powers in the 2016 Investigatory Powers Act (IPA): A controversial capability that allows intelligence agencies to legally collect and retain large amounts of data, instead of having to operate via targeted intercepts.
The law even allows for state agents to hack into devices en masse, without per-device grounds for individual suspicion.
Liberty, which was supported in the legal action by the National Union of Journalists, argued that bulk powers are incompatible with European human rights law on the grounds that the IPA contains insufficient safeguards against abuse of these powers.
Two months ago it published examples of what it described as shocking failures by UK state agencies — such as not observing the timely destruction of material; and data being discovered to have been copied and stored in “ungoverned spaces” without the necessary controls — which it said showed MI5 had failed to comply with safeguards requirements since the IPA came into effect.
However the judges disagreed that the examples of serious flaws in spy agency MI5’s “handling procedures” — which the documents also show triggered intervention by the Investigatory Powers Commissioner — add up to the conclusion that the Act itself is incompatible with human rights law.
Rejecting the argument in their July 29 ruling they found that oversight mechanisms the government baked into the legislation (such as the creation of the office of the Investigatory Powers Commissioner to conduct independent oversight of spy agencies’ use of the powers) provide sufficient checks on the risk of abuse, dubbing the regime as “a suite of inter-locking safeguards”.
Liberty expressed disappointment with the ruling — and has said it will appeal.
In a statement the group told the BBC: “This disappointing judgment allows the government to continue to spy on every one of us, violating our rights to privacy and free expression.
“We will challenge this judgment in the courts, and keep fighting for a targeted surveillance regime that respects our rights. These bulk surveillance powers allow the state to hoover up the messages, calls and web history of hordes of ordinary people who are not suspected of any wrongdoing.”
This is just one of several challenges brought against the IPA.
A separate challenge to bulk collection was lodged by Liberty, Big Brother Watch and others with the European Court of Human Rights (ECHR).
A hearing took place two years ago and the court subsequently found that the UK’s historical regime of bulk interception had violated human rights law. However it did not rule against bulk surveillance powers in principle — a point the UK judges note in their judgment, writing that consequently: “There is no requirement for there to be reasonable grounds for suspicion in the case of any individual.”
Earlier this year Liberty et al were granted leave to appeal their case to the ECHR’s highest court. That case is still pending before the Grand Chamber.
