Facial Recognition Still Doesn't Work Like it Does in Films

One police force has made 15 arrests using facial recognition matches, but twice as many innocent people have been significantly affected by false ones.

A system of facial recognition software and surveillance cameras used by law enforcement in London has falsely identified innocent people 98 per cent of the time, a civil liberties watchdog revealed in a report on Tuesday.

"We're seeing ordinary people being asked to produce ID to prove their innocence as police are wrongly identifying thousands of innocent citizens as criminals".

Big Brother Watch is taking the report to Parliament today to launch a campaign calling for police to stop using the controversial technology, branded by the group as "dangerous and inaccurate".

Police have been rolling out the software at major events such as sporting fixtures and music concerts, including a Liam Gallagher concert and international rugby matches, with the aim of identifying wanted criminals and people on watch lists.

There are also serious privacy concerns: South Wales Police is said to have stored images of 2,400 innocent people who were incorrectly matched by facial recognition for a year, without their knowledge.

London's Metropolitan Police has tested automated facial recognition (AFR) at three events: the Notting Hill Carnival in 2016 and 2017, and a Remembrance Sunday event in November, the watchdog discovered.

Freedom of Information requests by Big Brother Watch have shown that 91 per cent of the so-called "matches" flagged by South Wales Police's technology were wrong.
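
These figures are less paradoxical than they sound. When almost everyone scanned at an event is innocent, even a matcher with a low per-face error rate produces alerts that are overwhelmingly false, because false positives are drawn from a vastly larger pool than true ones. A minimal sketch of the arithmetic in Python, using purely illustrative numbers (none of them taken from the report):

    # Illustrative sketch: every figure below is an assumption,
    # not a number from the Big Brother Watch report.
    crowd_size = 100_000         # hypothetical people scanned at one event
    on_watchlist = 50            # hypothetical watch-list members in the crowd
    true_positive_rate = 0.90    # assumed: 90% of watch-list faces get flagged
    false_positive_rate = 0.01   # assumed: 1% of innocent faces get flagged too

    true_alerts = on_watchlist * true_positive_rate                   # 45.0
    false_alerts = (crowd_size - on_watchlist) * false_positive_rate  # 999.5

    share_false = false_alerts / (true_alerts + false_alerts)
    print(f"Share of alerts pointing at innocent people: {share_false:.0%}")
    # Prints roughly 96%, the same territory as the figures above.

Under these assumptions roughly 96 per cent of alerts point at innocent people; the exact share is driven almost entirely by how rare watch-listed faces are in the crowd, not by how "accurate" the underlying model is.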

IT Pro has approached both the Met and South Wales Police for comment.

Big Brother Watch's campaign, calling on UK public authorities to immediately stop using automated facial recognition software with surveillance cameras, is backed by David Lammy MP and 15 rights and race equality groups including Article 19, the Football Supporters Federation, Index on Censorship, Liberty, Netpol, the Police Action Lawyers Group, the Race Equality Foundation, and the Runnymede Trust.

"If we move forward on this path, these systems will mistakenly identify innocent people as criminals or terrorists and will be used by unscrupulous governments to silence unwelcome voices".

While she welcomed both the recent appointment of a member of the National Police Chiefs' Council (NPCC) to govern the use of facial recognition in public spaces and the establishment of an oversight panel including herself, the Biometrics Commissioner and the Surveillance Camera Commissioner, Information Commissioner Elizabeth Denham noted that she is "deeply concerned about the absence of national level co-ordination in assessing the privacy risks and a comprehensive governance framework to oversee FRT deployment".

The UK already has one of the world's largest CCTV networks.

This is essentially what happened with the Met Police's facial recognition system too, and police still plan to use the technology at music festivals and other events. Innocent citizens being constantly tracked, located and identified (or, as the figures suggest is more likely, misidentified as criminals) by an artificially intelligent camera system conjures up the kind of dystopian society Orwell warned about.