Home Office acknowledges racial bias in facial recognition technology

15 articles · Updated · The Guardian · May 7
  • The admission followed tests showing more false positives for Black and Asian faces in systems used by police and retailers across England and Wales.
  • Biometrics watchdogs in England, Wales and Scotland say oversight is too weak, with the Information Commissioner’s Office seen as inadequate and a postponed audit of Metropolitan Police use still unscheduled.
  • The government is reviewing the legal framework amid calls for stronger redress for misidentified people, scrutiny of alleged malicious watchlist additions, and wider concerns over privacy, civil liberties and overreach.
Is facial recognition's rollout a new form of systemic discrimination, given its proven bias against minorities?
As retailers and police build secret watchlists, what recourse do innocent citizens have when they are wrongly blacklisted?

Algorithmic Discrimination in UK Policing Facial Recognition: Impact, Backlash, and the Fight for Regulation (2025–2026)

Overview

Between 2025 and 2026, the UK Home Office acknowledged significant racial and gender biases in the Cognitec facial recognition system used by police, attributing them to non-diverse training data. Testing revealed higher false positive rates for Black and Asian individuals, leading to wrongful arrests and eroding public trust, especially in minority communities. The Home Office's delayed disclosure deepened this mistrust and drew criticism from oversight bodies including the ICO and the APCC. In response, the Home Office launched a mitigation strategy: developing a new, less biased algorithm, overhauling police training, and proposing stronger independent oversight. Meanwhile, plans to expand facial recognition into public spaces sparked legal challenges and civil society opposition, fuelling parliamentary debates on comprehensive regulation and privacy protections.

...