After the deadly terrorist shooting at France’s famous Strasbourg Christmas market, the U.K.’s Metropolitan Police will use facial recognition on Christmas shoppers.

Live facial recognition (LFR) technology can identify people from digital images. When people walk through an area under video surveillance, their faces are compared to a database of people wanted by the police and courts. This will be at least the seventh time the police have acknowledged using LFR.


According to U.K. civil liberties group Big Brother Watch, the facial recognition technology doesn’t work well. In one trial use at the Notting Hill Carnival, 95 people reportedly were misidentified as criminals.

“Real-time facial recognition is a dangerously authoritarian surveillance tool that could fundamentally change policing in the U.K. Members of the public could be tracked, located and identified—or misidentified—everywhere they go,” said Silkie Carlo, director of Big Brother Watch, according to the New York Post.

According to the report, South Wales Police store images of people incorrectly tagged as criminals for a year without their knowledge. The South Wales implementation alone has cost a reported $3.5 million.

There have been other reports of problems with facial recognition. Earlier this year, the American Civil Liberties Union tried Amazon’s facial recognition system on members of Congress, 28 of whom were incorrectly matched to police mugshots.


By Erik Sherman

This article was originally published on Fortune and has been republished under Creative Commons.