
Policing our privacy – where does the law lie?

David Mitchell analyses the Court of Appeal's recent ruling on the use of facial recognition technology by South Wales Police.

In R (Edward Bridges) v Chief Constable of South Wales [2020] EWCA Civ 1058 the Court of Appeal (Sir Terence Etherton MR, Dame Victoria Sharp PQBD and Singh LJ) allowed the appeal of the civil liberties campaigner, Edward Bridges, against the decision of the Divisional Court which had dismissed his claim for judicial review of South Wales Police Force’s use of live automated facial recognition technology (AFR).

AFR is a means of overt surveillance by which cameras capture images of members of the public which, using the somewhat sinister-sounding NeoFace Watch software, are then compared with images of persons on a watchlist compiled by the Police for the purpose of identification in any given deployment. Persons on the watchlist might include those wanted on warrants, individuals unlawfully at large, criminal suspects, persons who may be in need of protection (e.g. missing persons), individuals whose presence at a particular event is a cause of concern, persons of possible intelligence interest and / or vulnerable persons.

The appeal succeeded on three grounds concerning non-compliance with each of Article 8, section 64 of the DPA 2018 (data protection impact assessment) and the Public Sector Equality Duty.

The Court of Appeal’s disagreement with the Divisional Court’s Article 8 assessment was the key by which it unlocked the entire judgment below. The Court of Appeal’s focus was on the “in accordance with the law” criterion at Article 8(2). It was the Divisional Court’s view that the combination of primary legislation, secondary legislative instruments in the form of codes of practice issued under primary legislation, and the Police’s own local policies constituted a clear and sufficient legal framework for regulating the Article 8(1) interference.

Having considered the Supreme Court’s treatment of the “in accordance with the law” standard in R (Catt) v Association of Chief Police Officers [2015] AC 1065 and in Re Gallagher [2020] AC 185 at [86] to [89], the Court of Appeal outlined the following four features of AFR which it considered distinguished the present case from, for example, the taking of photographs or the use of CCTV: first, AFR is a novel technology; secondly, “it involves the capturing of the images and processing of digital information of a large number of members of the public, in circumstances in which it is accepted that the vast majority of them will be of no interest whatsoever to the police”; thirdly, it is concerned with “sensitive” personal data within the meaning of the DPA 2018; and, fourthly, this sensitive data is processed in an automated way.

In the Court’s view, the legal framework currently operated by the Police contained two “fundamental deficiencies”: first, the question of who was to be placed on a watchlist and, secondly, the issue of where AFR was to be deployed: “In relation to both of those questions too much discretion is currently left to individual police officers.” [91]

Whilst the Court considered that the current policies operated by the Police did not sufficiently set out the terms on which discretionary powers could be exercised, it declined “to design a particular set of policies in order for them to comply with the quality of law requirement” [94], albeit noting, unsurprisingly, that there should be consistency between policies operated by different Police forces [118].

Ground 1 was determinative of the appeal. Notwithstanding its finding regarding the quality of law criterion under Article 8(2), the Court went on to find that the interference was proportionate, dismissing the ground 2 challenge. As held by the Court, the effect of AFR on the Appellant on the two occasions about which he complained was “negligible”, as it was for other members of the public who had been subjected to the same surveillance [143].

Ground 1 was also determinative of the ground 3 challenge (unlawful data protection impact assessment under s.64, DPA 2018). In light of its findings regarding non-compliance with Article 8(2), it necessarily followed that the DPIA was defective, proceeding as it did on the assumption that the deployment was Article 8 compliant [153].

The ground 4 challenge (requirement for an appropriate policy document) was dealt with shortly and dismissed on the basis that s.42 DPA 2018 was not enacted at the time of the two deployments about which Mr Bridges complained.

Ground 5 (non-compliance with the public sector equality duty) concerned the Appellant’s suggestion (albeit unproven) that, “there is scientific evidence that facial recognition software can be biased and create a greater risk of false identifications in the case of people from black, Asian and other minority ethnic (“BAME”) backgrounds, and also in the case of women.” [164] The challenge was not put on the basis that AFR was indirectly discriminatory as a matter of fact, but that the Police were in breach “of the positive duty to have due regard to the need to eliminate such discrimination”. [165]

It was the view of the Court that this positive obligation under s.149 Equality Act 2010 placed a burden on the Police to show that AFR did not contain any possible bias on the grounds of race or sex. [182]

The case is of interest for a number of reasons. Firstly, it is a further example of how the law of privacy under Article 8 is increasingly regulated by the framework of data protection law. However, the two remain distinct. As noted by the Court at [104], “the legal protections in the DPA 2018 form an important part of the framework in determining whether the interference with the Appellant’s Article 8 rights was in accordance with the law. That Act is not, however, sufficient by itself, nor was it suggested that it is.”

Secondly, and relatedly, in its assessment of Article 8 the Court’s exercise remained an essentially subjective one. On the same facts (including the four factors considered as material by the Court of Appeal) the Divisional Court reached the opposite conclusion regarding Article 8(2).

Thirdly, whilst the appeal succeeded, it is unlikely to slow the use of AFR by the Police either within South Wales or beyond (not that this was the purpose of the challenge). The respondent has stated that it will not be applying for leave to take the matter to the Supreme Court. Subject to policy amendment, AFR is here to stay. Within the broader context of the law’s attempt to keep pace with technology, the judgment instead serves to confirm that the principal bulwark of individual privacy remains Article 8 combined with data protection law. As recently noted in the dissenting judgment of the Grand Chamber in Lopez Ribalda v Spain (1874/13), concerning covert workplace surveillance:

“This case demonstrates the growing influence and control that technology has in our world, and more particularly, the collection and use of our personal data in our everyday activities. As a living instrument, the Convention, and therefore the Court, not only needs to recognise the influence of modern technologies, but also has to develop more adequate legal safeguards to secure respect for the private life of individuals.”

In this wider conflict Mr Bridges’ success before the Court of Appeal, if not a categoric victory for respect for private life, at least serves to temper the zeal with which new surveillance and data collection technologies are adopted.

David Mitchell is a barrister at 39 Essex Chambers. He can be contacted by email or by telephone on 020 7832 1111.

(c) HB Editorial Services Ltd 2009-2022