Facial recognition may not be the high-tech policing solution it's purported to be: new figures show that facial-recognition software used by the UK's Metropolitan Police returned incorrect matches in 98 percent of cases.
According to figures published by The Independent (based on data obtained under freedom of information), only two of the 104 alerts generated by the facial-recognition software used by the Met Police were found to be accurate matches.
The Independent also reports software used by South Wales Police has returned more than 2,400 false positives since June 2017.
Facial recognition software is capable of scanning video footage and identifying individual faces to match with a database of known faces, such as wanted criminals. The software reduces each face to a map of biometric identifiers (such as the length of a person's nose and the distance between their eyes) and is capable of making a match in a fraction of a second.
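The matching step described above can be sketched in a few lines of code. In this minimal, illustrative example, each face is represented as a small vector of normalised biometric measurements and compared to a watchlist by Euclidean distance; all names, values, and the threshold are invented for illustration and do not reflect any real police system. The looser the threshold, the more false positives a system of this kind produces.

```python
import math

# Hypothetical watchlist: each face reduced to a vector of biometric
# measurements (e.g. nose length, inter-eye distance), normalised to 0-1.
# Names and values are invented for this sketch.
WATCHLIST = {
    "suspect_a": [0.42, 0.31, 0.77, 0.15],
    "suspect_b": [0.10, 0.88, 0.34, 0.62],
}

def euclidean(a, b):
    """Distance between two biometric feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match_face(probe, watchlist, threshold=0.25):
    """Return (name, distance) for the closest watchlist entry,
    or None if no entry falls under the match threshold."""
    name, features = min(watchlist.items(),
                         key=lambda kv: euclidean(probe, kv[1]))
    dist = euclidean(probe, features)
    return (name, dist) if dist < threshold else None

# A probe very close to suspect_a triggers an alert...
print(match_face([0.40, 0.33, 0.75, 0.16], WATCHLIST))
# ...while a dissimilar face returns no match.
print(match_face([0.90, 0.05, 0.05, 0.95], WATCHLIST))
```

Production systems use high-dimensional embeddings from neural networks rather than hand-picked measurements, but the core operation, nearest-neighbour search against a database with a distance threshold, is the same.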
The technology is also in use in countries such as Australia, as well as China, where state media says face-recognising glasses used by police can help law enforcement identify and detain persons of interest in as little as seven minutes.
Speaking to The Independent, the UK's independent Biometrics Commissioner, Paul Wiles, said the figures showed the technology "is not yet fit for use."
"In terms of governance, technical development and deployment is running ahead of legislation and these new biometrics urgently need a legislative framework, as already exists for DNA and fingerprints," he said.