
British facial recognition system is falsely flagging thousands of people as criminals

Facial recognition technology has seen some tremendous success across the globe, helping to catch criminals at high-profile events as far afield as China. Despite being four years in the making, however, the UK’s effort seems far less reliable, yielding thousands of false positives during its pilot test last summer.

The Automated Facial Recognition (AFR) ‘Locate’ system was piloted by South Wales Police during the 2017 Champions League final in Cardiff, Wales, where 170,000 attendees were checked against a database of 500,000 persons of interest. According to data on the force’s website, via The Guardian, 2,470 people were flagged by the software, of which around 2,297 turned out to be false positives.

While South Wales Police reports just 234 “true positives” of the 2,685 alerts recorded across 15 events, it maintains that “over 2,000 positive matches have been made using our ‘identify’ facial recognition technology, with over 450 arrests.”

“Successful convictions so far include six years in prison for robbery and four-and-a-half years imprisonment for burglary,” reports The Guardian. “The technology has also helped identify vulnerable people in times of crisis.”

South Wales Police attributes the high false positive rate to this being the technology’s first major deployment, not helped by the “poor quality images” supplied by agencies such as Uefa and Interpol. “Technical issues are common to all face recognition systems, which means false positives will be an issue as the technology develops. Since initial deployments during the European Champions League final in June 2017, the accuracy of the system used by South Wales police has continued to improve.”

The force emphasises that no arrests have resulted from a false positive, as the flagging process alerts an officer on site to investigate manually. If the flag is determined to be “an incorrect match,” then “officers will explain to the individual what has happened and invite them to see the equipment along with providing them with a Fair Processing Notice.”

Contrary to what the raw statistics suggest, the force bills the pilot as a “resounding success” despite ongoing privacy concerns surrounding the technology. Civil liberties campaign group Big Brother Watch in particular has chastised the AFR system, and plans to challenge its use in parliament next month.

Discuss on our Facebook page, or over on Twitter.

KitGuru Says: While this undoubtedly adds an extra layer of protection against wanted criminals, there will always be privacy concerns surrounding the technology, with many wondering what it will be used for beyond catching felons. Do you like the idea of the Automated Facial Recognition system?
