During the Champions League Final last summer in Cardiff, Wales, UK police began a pilot facial recognition program designed to compare attendees against a database of 500,000 images of people of interest. Almost a year later, The Guardian reports that the pilot produced 2,470 possible matches, of which 2,297 were "false positives."
In a freedom of information request (surfaced by Wired), the South Wales Police revealed that at events such as the 2017 Champions League Final, the Automatic Facial Recognition (AFR) 'Locate' system flagged 2,470 people, with only 173 positive matches. The figures in the report show that of the 2,685 alerts across 15 events, only 234 were true positives, with another 2,451 false positives. But in its press release, the SWP states that it has made 2,000 positive matches and has used that information to make 450 arrests in the last nine months. We have contacted the SWP to ask about the discrepancy in the numbers, and will update if we receive a response.
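To put the reported figures in perspective, a quick arithmetic check shows what they imply about the system's precision. The totals below are the ones from the report (2,685 alerts, 234 true positives, 2,451 false positives); the percentage labels are our own calculation, not figures stated by the SWP.

```python
# Figures reported across the 15 events.
total_alerts = 2685
true_positives = 234
false_positives = 2451

# The two categories should account for every alert.
assert true_positives + false_positives == total_alerts

# Precision: the fraction of alerts that flagged the right person.
precision = true_positives / total_alerts
print(f"precision: {precision:.1%}")  # roughly 8.7%
print(f"share of alerts that were false positives: {false_positives / total_alerts:.1%}")  # roughly 91.3%
```

In other words, on these numbers fewer than one alert in ten pointed at the right person.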
AFR works by taking live feeds from CCTV cameras mounted at fixed locations or in vehicles and comparing the faces it detects against a database of 500,000 images. When the system flags someone, an operator will either dismiss the alert or send officers to speak with the individual in question. "If an incorrect match has been made," the SWP explains, "officers will explain to the individual what has happened and invite them to see the equipment along with providing them with a Fair Processing Notice." The force also says that no arrests have resulted from a false positive.
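The matching step in a system of this kind can be sketched in a few lines. The sketch below is illustrative only, not the SWP's actual pipeline: it assumes each face (from the live feed or the watchlist) has already been reduced to a numeric embedding, and flags a match whenever similarity to any watchlist entry crosses a threshold. The function names, embeddings, and threshold are all hypothetical; the point is the trade-off that a lower threshold catches more genuine targets but also generates more false positives.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two face embeddings (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def find_matches(live_embedding, watchlist, threshold=0.8):
    """Return watchlist entries whose similarity exceeds the threshold.

    Lowering the threshold widens the net -- more true matches, but
    many more false positives for a human operator to dismiss.
    """
    return [
        (person_id, sim)
        for person_id, ref in watchlist.items()
        if (sim := cosine_similarity(live_embedding, ref)) >= threshold
    ]

# Toy watchlist of two "people of interest" (made-up 3-number embeddings).
watchlist = {"person_A": [0.9, 0.1, 0.4], "person_B": [0.1, 0.8, 0.2]}
print(find_matches([0.85, 0.15, 0.35], watchlist, threshold=0.8))
```

In a real deployment the embeddings come from a trained face-recognition model and the watchlist holds hundreds of thousands of entries, which is why every alert is routed to a human operator before any action is taken.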
The SWP explains that no facial recognition program is 100 percent accurate, and that technical problems "will continue to be a common issue for the foreseeable future." It also notes that many of the false positives were the result of poor-quality images provided by other agencies.
Despite the large number of false positives, the SWP says the pilot has been a "resounding success" and that the "overall effectiveness of facial recognition has been high." But while the pilot has produced some arrests (and the SWP notes that it has been mindful of the privacy risks), Wired cites privacy groups such as Big Brother Watch, which have criticized the technology as a "dangerously inaccurate policing tool" and say they will launch a campaign against it in parliament next month.