In a setback for law enforcement, the English Court of Appeal recently ruled in R (Bridges) v Chief Constable of South Wales Police and others that the use of live facial recognition (LFR) technology by South Wales Police breached the European Convention on Human Rights, the Data Protection Act 2018 and the Equality Act 2010.

The use of LFR technology, also known as automated facial recognition technology, gives rise to real privacy concerns. The technology involves the real-time automated processing of digital images containing the faces of individuals for the purposes of identification, authentication or verification, or categorisation, of those individuals. Privacy campaigners contend that the technology is inaccurate and intrusive, and that it infringes an individual’s right to privacy. Supporters of the technology argue that it protects the public because it can catch people such as terror suspects in a way that conventional policing cannot.

The London Metropolitan Police, South Wales Police and Leicestershire Police have been trialling LFR in public since 2015. LFR was used at the Download Festival in 2015, and South Wales Police’s first deployment was at the UEFA Champions League final in June 2017. South Wales Police used the technology to capture faces and match them against a database (or “watch list”) of people wanted by the police and/or the courts, or of vulnerable individuals in need of protection. When the system identified a match, the police were presented with both images so that an officer could decide whether to stop and speak to the person. Unmatched faces were deleted immediately; matched images were deleted after 30 days.
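For readers curious about the mechanics, the workflow described above can be sketched in a few lines of Python. This is a conceptual sketch only, not South Wales Police’s actual system: the similarity threshold, the embedding vectors and all function and variable names are hypothetical assumptions chosen for illustration.

    from dataclasses import dataclass
    from datetime import datetime, timedelta

    import numpy as np

    MATCH_THRESHOLD = 0.8            # hypothetical similarity cut-off (assumed)
    RETENTION = timedelta(days=30)   # matched images kept for 30 days, per the trial

    @dataclass
    class WatchlistEntry:
        name: str
        embedding: np.ndarray        # face embedding from some upstream model (assumed)

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def process_capture(embedding, watchlist, now):
        """Compare one captured face against the watch list.

        Returns (entry, expiry) for a possible match, which a human officer
        would then review; returns None for a non-match, in which case the
        caller must discard the captured image straight away.
        """
        for entry in watchlist:
            if cosine_similarity(embedding, entry.embedding) >= MATCH_THRESHOLD:
                return entry, now + RETENTION   # retain both images pending review
        return None                             # no match: delete immediately

    # Illustrative call with random vectors standing in for real embeddings.
    rng = np.random.default_rng(0)
    watchlist = [WatchlistEntry("subject-1", rng.normal(size=128))]
    result = process_capture(rng.normal(size=128), watchlist, datetime.now())
    print("possible match, held for officer review" if result else "no match, capture deleted")

Note that, even in this simplified form, a match only triggers a human decision. As discussed below, it was the discretion surrounding that process, in particular who appeared on the watch list and where the system was deployed, that the Court of Appeal found insufficiently constrained by law.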

In May 2018, the Information Commissioner’s Office (ICO) opened an investigation into the trials of LFR by South Wales Police and the Met Police and into its use for law enforcement purposes. During the ICO investigation, the lawfulness of South Wales Police’s use of LFR became the subject of a claim for judicial review. Mr Edward Bridges had been in Cardiff when South Wales Police deployed the LFR system. Supported by Liberty, Mr Bridges brought the claim for judicial review even though his image had been deleted automatically because he was not on any watch list.

Although the High Court had dismissed his claim, the Court of Appeal ruled in his favour. In reaching its decision, the Court of Appeal found:

  • The use of LFR was an unlawful interference with a person’s right to respect for their private and family life under Article 8(1) of the European Convention on Human Rights. The Court ruled that police officers had too much discretion in deciding who appeared on the watch list and where LFR was deployed.
  • The data protection impact assessment undertaken by South Wales Police was inadequate. South Wales Police had failed to assess and mitigate the risks to the rights and freedoms of data subjects as their impact assessment was written on the basis that Article 8 of the ECHR was not infringed.
  • South Wales Police had failed to take reasonable steps to satisfy themselves that the LFR software did not have an unacceptable bias on grounds of race or sex. Although there was no clear evidence of such bias, South Wales Police had failed to comply with their public sector equality duty (PSED) under section 149 of the Equality Act 2010.

The Court of Appeal decision is significant as it is the first UK case to consider the use of LFR technology. Meanwhile, San Francisco and Oakland in California and Somerville in Massachusetts have all banned the use of the technology by government agencies.

The ICO has issued a statement welcoming the Court of Appeal’s judgment as it provides clarification on, amongst other things, the data protection requirements for deploying LFR technology in public places. The decision recognises that there must be a clear legal framework for the use of such technology. It remains to be seen how South Wales Police and other police forces will react to the decision and what protections will be implemented. It is, however, once again a salutary reminder of the need for the public to have trust and confidence in the deployment of biometric technology such as facial recognition software.

There remain, of course, concerns that potential bias in the algorithms behind LFR technology may lead to misidentification. As the recent exam results controversy showed, algorithms cannot always be trusted to come up with the correct answer.

If you are considering using technology to make automated decisions about people, please do not hesitate to contact our team of data protection specialists for assistance.