Automated facial recognition, privacy and data protection law

The Divisional Court has found that police use of automated facial recognition was a justified privacy intrusion. Robin Hopkins sets out the key findings in a landmark case.

The opening sentence of the judgment in R (Bridges) v Chief Constable of South Wales Police and Others [2019] EWHC 2341 (Admin) is right up Panopticon’s* alley: “The algorithms of the law must keep pace with new and emerging technologies”. In precisely that spirit, the Divisional Court’s (Haddon-Cave LJ and Swift J) dismissal of the challenge to South Wales Police’s use of Automated Facial Recognition technology (“AFR”) contains very significant lessons in how to apply privacy and data protection law to beneficial but intrusive technology.

South Wales Police’s “AFR Locate” pilot project involves using surveillance cameras to capture digital images of members of the public, in particular at locations and events susceptible to criminality. Cameras are positioned so as to maximise coverage at these events, and they are capable of capturing up to 50 faces per second. A biometric data analysis is then performed on every captured face; this data is run against a database of wanted persons, categorised into various lists depending on how much of a baddie each wanted person is. If there is no match, the biometric data about the individual is immediately deleted (with the underlying CCTV footage retained for 31 days as normal). If there is a match, the police decide what action to take.
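To make that data flow concrete, here is a minimal sketch (in Python) of the capture-compare-discard logic as the judgment describes it. Everything in it is illustrative: the names, the data structure and the 0.9 similarity threshold are assumptions for exposition only, not SWP’s actual system or any real facial recognition library.

```python
from dataclasses import dataclass
from typing import Callable, List

# A biometric template extracted from one captured face (illustrative only).
Template = List[float]

@dataclass
class WatchlistEntry:
    name: str
    category: str        # seriousness tier, per the categorised watchlists
    template: Template

def process_frame(
    faces: List[Template],
    watchlist: List[WatchlistEntry],
    similarity: Callable[[Template, Template], float],
    threshold: float = 0.9,  # assumed threshold, not taken from the judgment
) -> List[WatchlistEntry]:
    """Compare each captured face against the watchlist.

    Biometric data for an unmatched face is discarded as soon as it goes out
    of scope; only the underlying CCTV footage is retained (for 31 days).
    """
    referrals: List[WatchlistEntry] = []
    for template in faces:
        matches = [entry for entry in watchlist
                   if similarity(template, entry.template) >= threshold]
        if matches:
            # A possible match is referred to a human officer, who decides
            # what, if any, action to take: no automated decision is made.
            referrals.extend(matches)
    return referrals
```

The point the sketch illustrates is the one the Court later relied on: for an unmatched individual, the biometric data exists only transiently within the comparison step, and nothing about that individual is retained.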

Mr Bridges is not on a wanted list, and it was accepted that he was probably subject to the AFR process outlined above on a couple of occasions. He brought judicial review proceedings on the point of principle that this process is unlawfully intrusive. For our purposes (i.e. leaving aside a challenge concerning the Public Sector Equality Duty), his grounds of claim were (1) contraventions of his rights under Article 8 ECHR, and (2) contraventions of data protection legislation (i.e. the DPA 1998 and then the DPA 2018, and in particular Part 3, which implements the Law Enforcement Directive). The Court’s analysis under both headings – and the interplay between the two headings – is detailed and instructive.

Article 8 ECHR

First question: was there an interference with the claimant’s rights under Article 8(1)? Answer: yes. Key conclusions emerging from the Court’s analysis of the case law on photographs, fingerprints and DNA in particular include these:

  • AFR is significantly intrusive: it “goes much further than the simple taking of a photograph. The digital information that comprises the image is analysed and the biometric facial data is extracted. That information is then further processed when it is compared to the watchlist information. The fact that this happens when the Claimant is in a public space is not a sufficient response” (para 54);
  • Biometric data is inherently private: “Like fingerprints and DNA, AFR technology enables the extraction of unique information and identifiers about an individual allowing his or her identification with precision in a wide range of circumstances. Taken alone or together with other recorded metadata, AFR-derived biometric data is an important source of personal information… it is information of an “intrinsically private” character. The fact that the biometric data is derived from a person’s facial features that are “manifest in public” does not detract from this. The unique whorls and ridges on a person’s fingertips are observable to the naked eye. But this does not render a fingerprint any the less a unique and precise identifier of an individual. The facial biometric identifiers too, are precise and unique” (para 57);
  • The speed of erasure does not preclude an Article 8 interference: “The application of Article 8 is not dependent on the long-term retention of biometric data. It is sufficient if biometric data is captured, stored and processed, even momentarily. The mere storing of biometric data is enough to trigger Article 8 and the subsequent use (or discarding) of the stored information has no bearing…” (para 59).

Second question: was this interference “in accordance with the law”? Answer: yes. Key points:

  • The police’s use of AFR was not ultra vires: its common law powers to prevent and detect crime sufficed.
  • While there is as yet no bespoke legislation governing AFR use, the applicable legal framework satisfies the “in accordance with the law” tests (summarised at para 80). The components of that framework are: primary legislation (the DPA 2018 and Law Enforcement Directive, mirrored in the Code of Practice on the Management of Police Information); secondary legislative instruments (the Surveillance Camera Code of Practice); and South Wales Police’s own policies.
  • The Court did acknowledge, however, that while the governing legal framework suffices for now, it may need periodic review as AFR use ramps up (see para 97).

Third question: was this use of AFR proportionate, by reference to the principles summarised in Bank Mellat? Answer: yes again. The police’s use of AFR struck a fair balance and was not disproportionate. Key points:

  • AFR Locate was deployed in an open and transparent way, with significant public engagement. It was explained on the police’s website and on Facebook and Twitter feeds relating to the events at which AFR was deployed; it was also explained on posters on police vehicles at the scene, and in privacy notices available on postcards.
  • On each occasion, AFR was used for a limited time, and covered a limited footprint.
  • AFR was used for the specific and limited purpose of seeking to identify particular individuals who may have been in the area and whose presence was of justifiable interest to the police.
  • There is no ‘computer says arrest the baddie’ procedure here. Where the AFR system identifies a possible match, a human police officer must decide what, if any, action to take.
  • Nobody was wrongly arrested. Nobody complained as to their treatment (save for the claimant on a point of principle).
  • Any interference with the claimant’s Article 8 rights would have been very limited, given the near-instantaneous algorithmic processing and deletion of his biometric data. No personal information relating to the claimant would have been available to any police officer, or to any human agent. None of his data would be retained. There was no attempt to identify or approach him.
  • Granular retention periods were specified in the DPIA (see para 38).

So, this was a case in which Article 8(1) rights were interfered with, but that interference was justified by the public interest in harnessing new technologies to aid the detection and prevention of crime, deployed here in a lawful and proportionate way. As ever, the Article 8 analysis carries over to the data protection analysis to a significant extent.

The data protection claims

The first issue was whether the biometric data captured by the AFR cameras was the personal data of persons who are not on watchlists. On this old chestnut (personal data or not), paras 115-127 of the judgment are well worth reading.

The Court considered the two ways in which this biometric data could be considered the personal data of those for whom there was no match, namely (a) indirect identification (applying e.g. Breyer) and (b) individuation (i.e. singling the individual out and distinguishing him from all others, as discussed e.g. in Vidal-Hall). In this case, the Court held that the police’s use of AFR did involve the processing of the personal data of individuals who were not matched – on the “individuation” analysis, but not the “indirect identification” analysis.

The second issue: did this processing contravene the first data protection principle under the DPA 1998? Answer: no, for basically the same reasons as discussed under Article 8 ECHR. Interestingly, the Court thought that the most suitable condition from Schedule 2 to the DPA 1998 was probably legitimate interests.

The third issue: was this “sensitive processing” within the meaning of section 35(8) of the DPA 2018? (Sorry, what does that mean, I hear you ask? Remember that we are in law enforcement rather than GDPR territory here; section 35 DPA 2018 contains in effect the first data protection principle, including provision for what would be the processing of “special category data” in GDPR-speak).

Answer: yes, this processing of biometric data is “sensitive processing”. “Although SWP’s overall purpose is to identify the persons on the watchlist, in order to achieve that overall purpose, the biometric information of members of the public must also be processed so that each is also uniquely identified, i.e. in order to achieve a comparison” (para 133).

Fourth issue: was that processing justified under section 35 DPA 2018? Answer (you guessed it): yes. It was “strictly necessary” for law enforcement purposes, it satisfied a condition from Schedule 8 to the DPA 2018, and the police had an appropriate policy document in place that – just about – complied with section 42 DPA 2018.

That last point is a bit niche, but worth noting, given that the “appropriate policy document” is one of those novel procedural safeguards introduced under the DPA 2018, across both GDPR and law enforcement regimes. Basically, the Court seemed to think the document in this case was not great, but just about cut the mustard (see paras 140-141).

The final data protection issue is also about procedural safeguards in our new (post-GDPR) world, namely Data Protection Impact Assessments (again, important under both law enforcement and GDPR processing). Interestingly, the Court was reluctant to scrutinise this too closely. See para 146 on its approach (which may be comforting to data controllers worried about challenges to their DPIAs):

  • “What is required is compliance itself, i.e. not simply an attempt to comply that falls within a range of reasonable conduct. However, when determining whether the steps taken by the data controller meet the requirements of section 64, the Court will not necessarily substitute its own view for that of the data controller on all matters. The notion of an assessment brings with it a requirement to exercise reasonable judgement based on reasonable enquiry and consideration. If it is apparent that a data controller has approached its task on a footing that is demonstrably false, or in a manner that is clearly lacking, then the conclusion should be that there has been a failure to meet the section 64 obligation. However, when conscientious assessment has been brought to bear, any attempt by a court to second-guess that assessment will overstep the mark.”

Whatever your gut feel on the overall balance, the Divisional Court has added real sophistication to the “algorithms of the law” in their race to keep up with technology.

Robin Hopkins (@hopkinsrobin) is a barrister at 11KBW.

* This article first appeared in the set's Panopticon blog.
