Court of Appeal decides partially in favour of Bridges in facial recognition case

The Court of Appeal this morning published its decision in the case in which Edward Bridges challenged South Wales Police (SWP) over the lawfulness of its use of a CCTV facial recognition matching system. The system works by extracting faces captured in a live camera feed and automatically comparing them to faces on a watchlist. If no match is detected, the software automatically deletes the facial image captured from the live feed. If a match is detected, the technology produces an alert and the person responsible for the technology, usually a police officer, reviews the images to determine whether to make an intervention.

Bridges was not included on an SWP watchlist in any of its deployments of the CCTV facial recognition matching system. However, he argued that, given his proximity to the cameras, his image was recorded by the CCTV system, even if it was deleted almost immediately afterwards. SWP did not contest this point.

Bridges brought a claim for judicial review, assisted by Liberty, the human rights organisation, on the basis that the CCTV facial recognition system was not compatible with:

  1. the right to respect for private life under Article 8 of the European Convention on Human Rights,
  2. data protection legislation, and
  3. the Public Sector Equality Duty (“PSED”) under section 149 of the Equality Act 2010.

On 4 September 2019 the Divisional Court (“DC”) dismissed Mr Bridges’s claim for judicial review on all grounds.

However, in this decision, the Court of Appeal took a more nuanced approach to each separate point, as follows:

  1. The appeal succeeded as the Court held that although the legal framework comprised primary legislation (DPA 2018), secondary legislation (The Surveillance Camera Code of Practice), and local policies promulgated by SWP, there was no clear guidance on where the facial recognition system could be used and who could be put on a watchlist.
  2. The appeal failed in that the benefits were potentially great, and the impact on Mr Bridges was minor, and so the use of the facial recognition system was proportionate.
  3. The appeal succeeded on the ground that the Data Protection Impact Assessment, required by the Data Protection Act 2018, was inadequate as it had assumed that Article 8 of the European Convention on Human Rights was not infringed.
  4. The Divisional Court did not need to decide whether the SWP had in place an “appropriate policy document” within the meaning of section 42 DPA 2018.
  5. The appeal succeeded in that the SWP failed to take reasonable steps to make enquiries about whether the facial recognition software was biased on racial or sex grounds. The Court did note, however, that there was no clear evidence that the software was in fact biased on the grounds of race and/or sex.

The SWP has confirmed that it does not seek to appeal against this judgment.
