Britain’s Court of Appeal held Tuesday that the South Wales Police Force’s use of facial recognition technology violates human rights and data protection laws.
The South Wales Police Force (SWP) uses automated facial recognition (AFR) technology to determine whether two facial images show the same person. Its AFR Locate system automatically captures faces from a live camera feed and compares them against a watchlist that can hold up to 2,000 images. If no match is detected, the software automatically deletes the captured images. If a match is detected, the system issues an alert and an officer reviews the images to decide whether to act.
SWP overtly deployed AFR Locate 50 times at public events between May 2017 and April 2019. Because AFR Locate can scan 50 faces per second, an estimated 500,000 faces were scanned during that period.
Appellant Edward Bridges, a civil liberties campaigner, was present near two deployments of AFR Locate. Although he was not on the watchlist, he claimed that his image was captured by the system, even if it was deleted immediately afterward. He brought claims that AFR violated the right to respect for private life under Article 8 of the European Convention on Human Rights, data protection legislation, and the Public Sector Equality Duty (PSED) under the Equality Act 2010.
In September 2019, the lower court dismissed the claims, finding that the interference with the privacy right was proportionate and in accordance with the law. It also dismissed the data protection claims and held that the PSED had not been breached.
On appeal, the Court of Appeal found that the legal framework was insufficient to render the interference “in accordance with the law” under Article 8. Individual police officers had too much discretion, it was unclear who could be placed on a watchlist, and there were no clear criteria for determining where AFR could be deployed.
The court also found that SWP had not complied with the PSED. The duty’s purpose is to ensure that public authorities consider the potentially discriminatory impact of their policies, and the court found that SWP did not take reasonable steps to determine whether the software was biased on grounds of race or sex.
For these reasons, the court held that the use of AFR Locate violated human rights and data protection laws.