Use of facial recognition technology was unlawful but proportionate under Article 8, finds Court of Appeal

The Court of Appeal has held that the use of facial recognition technology by South Wales Police (SWP) was unlawful because it was not "in accordance with the law" for the purposes of Article 8(2) of the European Convention on Human Rights (ECHR), although it upheld the finding that the interference with Article 8 rights was proportionate.

The decision in R (Bridges) -v- CC South Wales and others followed an appeal by civil liberties campaigner Edward Bridges against the decision of the Divisional Court in September 2019 dismissing his claim for judicial review of the force’s use of automated facial recognition technology (AFR) in a pilot project.

Bridges claimed the use of the technology was incompatible with the right to respect for private life under Article 8 of the ECHR, with data protection legislation, and with the Public Sector Equality Duty (PSED) under section 149 of the Equality Act 2010.

Automated facial recognition technology of the kind used by SWP works by extracting faces captured in a live feed from a camera and automatically comparing them to faces on a watchlist. SWP deployed the technology in public around 50 times between May 2017 and April 2019.

If a match is detected, the technology produces an alert and the person responsible for the technology, usually a police officer, will review the images to determine whether to intervene. If no match is detected, the software automatically deletes the facial image captured from the live feed.

To date, SWP watchlists have comprised between 400 and 800 people. Across the roughly 50 deployments between 2017 and 2019, an estimated 500,000 faces were scanned. The overwhelming majority belonged to people not on a watchlist, and their images were therefore automatically deleted.

Mr Bridges – who lives in Cardiff – claimed that given his proximity to the cameras he was one of the estimated 500,000 people scanned. Even though he was not on a watchlist and his image was most likely deleted, Mr Bridges brought a claim for judicial review.

In that hearing, the Divisional Court (DC) found that although the right to privacy under Article 8 of the Convention was engaged, the interference with those rights was in accordance with the law and proportionate.

The DC dismissed both data protection claims, brought under the Data Protection Act 1998 and the Data Protection Act 2018 (“DPA 2018”). Finally, the DC rejected Mr Bridges’ argument that SWP breached the PSED by not considering the possibility that AFR Locate (SWP’s name for its live deployments of the technology) might produce results that were indirectly discriminatory on the grounds of sex and/or race because it produces a higher rate of positive matches for female faces and/or for black and minority ethnic faces. The DC held that the PSED was not breached because there was no suggestion in April 2017, when the AFR Locate trial commenced, that the software might operate in a way that was indirectly discriminatory.

At the Court of Appeal, Mr Bridges challenged the initial findings of the Divisional Court on the following grounds:

Ground 1: The Divisional Court erred in concluding that the interference with the Appellant's rights under Article 8(1) of the Convention, taken with section 6 of the HRA 1998, occasioned by SWP's use of AFR on 21 December 2017 and 27 March 2018 and on an ongoing basis, was/is in accordance with the law for the purposes of Article 8(2).

Ground 2: The Divisional Court made an error of law in assessing whether SWP's use of AFR at the December 2017 and March 2018 deployments constituted a proportionate interference with Article 8 rights within Article 8(2). The Divisional Court failed to consider the cumulative interference with the Article 8 rights of all those whose facial biometrics were captured as part of those deployments.

Ground 3: The Divisional Court was wrong to hold that SWP's DPIA complied with the requirements of section 64 of the DPA 2018.

Ground 4: The Divisional Court erred in declining to reach a conclusion as to whether SWP has in place an "appropriate policy document" within the meaning of section 42 of the DPA 2018 (taken with section 35(5) of the DPA 2018), which complies with the requirements of that section. Having in place such a document is a condition precedent for compliance with the first data protection principle (lawful and fair processing) contained in section 35 of the DPA 2018 where the processing of personal data constitutes "sensitive processing" within the meaning of section 35(8) of the DPA.

Ground 5: The Divisional Court was wrong to hold that SWP complied with the PSED in circumstances in which SWP's Equality Impact Assessment was obviously inadequate and was based on an error of law (failing to recognise the risk of indirect discrimination) and SWP's subsequent approach to assessing possible indirect discrimination arising from the use of AFR is flawed. It is argued that the Divisional Court failed in its reasoning to appreciate that the PSED is a continuing duty.

The appeal was allowed on Grounds 1, 3 and 5; Grounds 2 and 4 were rejected.

On Ground 1, the court said: "too much discretion is currently left to individual police officers. It is not clear who can be placed on the watchlist nor is it clear that there are any criteria for determining where AFR can be deployed."

The lack of guidance on where the technology can be located and who could be put on a watchlist are "two critical defects in the current legal framework", the court said.

On Ground 2, the Court of Appeal agreed that SWP’s actions constituted a proportionate interference with the claimant’s Article 8 rights. It held that the DC had correctly conducted a weighing exercise, with the actual and anticipated benefits of AFR Locate on one side and the impact of the deployments on Mr Bridges on the other. The benefits were potentially great and the impact on Mr Bridges minor, so the use of AFR was proportionate under Article 8(2).

On Ground 3, the court found that the Divisional Court was wrong to hold that SWP provided an adequate “data protection impact assessment” (“DPIA”) as required by section 64 of the DPA 2018. The court found that, as the DPIA was written on the basis that Article 8 was not infringed, the DPIA was deficient.

On Ground 4, the court rejected Mr Bridges’ argument that the Divisional Court erred by declining to decide whether SWP had in place an “appropriate policy document” within the meaning of section 42 DPA 2018. The court held that the Divisional Court was entitled not to reach a conclusion on this point because it did not need to be decided: the two specific deployments of AFR Locate which formed the basis of Mr Bridges’ claim occurred before the DPA 2018 came into force.

On Ground 5, the court found that the Divisional Court was wrong to hold that SWP complied with the PSED. The court held that the purpose of the PSED was to ensure that public authorities give thought to whether a policy will have a discriminatory potential impact. SWP erred by not taking reasonable steps to make enquiries about whether the AFR Locate software had bias on racial or sex grounds. The court did note, however, that there was no clear evidence that AFR Locate software was in fact biased on the grounds of race and/or sex.

Concluding the decision, Sir Terence Etherton MR, Dame Victoria Sharp PQBD and Lord Justice Singh said: “As to the appropriate remedy, we consider that declaratory relief to reflect the reasons why this appeal has succeeded will suffice. In the circumstances which have arisen, the parties agree that the only remedy which is required is a declaration but they have not been able to agree the precise terms of a declaration. Having considered the rival contentions, we have concluded that the declaration proposed by SWP more accurately reflects the judgment of this court. We will grant a declaration in the following terms:

“The Respondent's use of Live Automated Facial Recognition technology on 21 December 2017 and 27 March 2018 and on an ongoing basis, which engaged Article 8(1) of the European Convention on Human Rights, was not in accordance with the law for the purposes of Article 8(2).

“As a consequence of the declaration set out in paragraph 1 above, in respect of the Respondent's ongoing use of Live Automated Facial Recognition technology, its Data Protection Impact Assessment did not comply with section 64(3)(b) and (c) of the Data Protection Act 2018.

“The Respondent did not comply with the Public Sector Equality Duty in section 149 of the Equality Act 2010 prior to or in the course of its use of Live Automated Facial Recognition technology on 21 December 2017 and 27 March 2018 and on an ongoing basis.”

South Wales Police said that “it could work with the outcome” of the decision and did not intend to appeal.

The Surveillance Camera Commissioner for England and Wales, Tony Porter, told the BBC that he hopes the Home Office will use the decision to update a "woefully" out-of-date code of practice used to regulate facial recognition and other surveillance efforts.

Adam Carey