Facial recog system used by Met Police shows racial bias at low thresholds
The UK Parliament has heard that a facial recognition system used by the Metropolitan police during the King's Coronation can exhibit racial bias at certain thresholds.
Speaking to the Science, Innovation and Technology Committee, Dr Tony Mansfield, principal research scientist at the National Physical Laboratory, said the NEC-based system used by the Met, the UK's largest police force, was prone to bias against Black individuals on a set of test data created for his investigation.
"We find that if the system is run at low thresholds and easy thresholds, that it does start showing a bias against the Black males and females combined," he told MPs. Mansfield added that he believed the Met did not operate the system at these thresholds.
"I completely understand public concern around [areas] like facial recognition technology and AI, and there's lots of debate going on around that. I've tried introducing facial recognition technology in as careful, proportionate and transparent way possible," she told MPs.

Nonetheless, the introduction of facial recognition in policing has attracted criticism in the UK. In 2017, the Met was urged to cancel its planned use of facial recognition software at Notting Hill Carnival, Europe's largest street festival.
In 2018, campaign group Big Brother Watch found that 91 per cent of people flagged by the Met's facial recognition system were not, in fact, on the watch list.
"As an advocate of the accountable and proportionate use of new technology by the police I think this lacuna is problematic as much for the police themselves as for the communities they serve," he said.
News URL
https://go.theregister.com/feed/www.theregister.com/2023/05/25/facial_recognition_system_used_by/