Facial recognition technology seems to be getting a bad rap these days. On one hand, it can keep you safe in an airport or a busy city. On the other hand, it can be used in nefarious ways by governments. We recently heard that Google would stop selling its facial recognition technology, at least for now. But elsewhere it continues to be used. In fact, many police forces in the United Kingdom are using it despite warnings of high error rates. Most recently, the technology is being used to scan the faces of Christmas shoppers in London, with police hoping to find wanted criminals. But is that taking the technology too far?
This is the seventh time that the Metropolitan Police have trialled facial recognition software on the public. The technology has previously been used at large events, including Notting Hill Carnival in 2016 and 2017, and Remembrance Day services last year. This time, it is being used on Monday and Tuesday of this week in Soho, Piccadilly Circus, and Leicester Square, all major shopping areas in the heart of the city. But does this seem like the police are taking things too far in terms of catching criminals? Or do you think that this maybe isn’t enough?
I am going to ask a hard question, which might put some people off, but it has to be said. Is deploying this kind of technology really just a way of letting police off the hook from doing their job? To me, this isn’t actual police work, but rather using technology as a shortcut for identifying people. Yes, technology now gives us that opportunity, which we didn’t have before, but is it OK for a police force to put up cameras around the city and call that work? I know that we’re going to get all kinds of comments on this one, but I had to ask the question.
Cameras are fixed to lamp posts or mounted on top of vans, and the software, developed by the Japanese firm NEC, measures the structure of passing faces. Once a face has been scanned, it is compared against a database of police mugshots. The Metropolitan Police say that a match flagged by the software will prompt officers to examine the individual and decide whether or not to stop them. Posters inform the public that the technology is in use, essentially an official notice that Big Brother is watching.
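To make the scan-and-match step concrete, here is a rough sketch of how a watchlist comparison of this kind can work in principle. This is not NEC's actual system: in such systems each face is typically reduced to a numeric "faceprint" vector and compared against stored mugshot vectors, with a similarity threshold deciding whether officers get alerted. All names, vectors, and the threshold below are invented for illustration.

```python
# Hypothetical sketch: compare a scanned face's feature vector against a
# watchlist of mugshot vectors using cosine similarity. A "match" only
# flags someone for a human officer to review; it does not confirm identity.
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length numeric vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def check_against_watchlist(face_vector, watchlist, threshold=0.9):
    """Return the best-matching watchlist ID above the threshold, or None.

    None means the passer-by is ignored; anything else would prompt
    an officer to take a closer look.
    """
    best_id, best_score = None, threshold
    for person_id, mugshot_vector in watchlist.items():
        score = cosine_similarity(face_vector, mugshot_vector)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id

# Invented example watchlist of mugshot "faceprints".
watchlist = {
    "suspect_a": [0.1, 0.9, 0.3],
    "suspect_b": [0.8, 0.2, 0.5],
}

print(check_against_watchlist([0.82, 0.19, 0.52], watchlist))  # prints suspect_b
print(check_against_watchlist([0.0, 0.0, 1.0], watchlist))    # prints None
```

Note that the threshold is doing all the work here: set it too low and nearly everyone looks like a "match", which is one way a deployment can end up with the kind of error rates reported later in this piece.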
But here’s the catch: you, as an individual, can decline to be scanned! So maybe my earlier question wasn’t so far off. Privacy advocates have come out strongly opposed to this technology’s use in the UK. In fact, the group Big Brother Watch has described the police’s justification for using this software as “misleading, incompetent, and authoritarian”. I, however, just think it’s lazy police work.
Not only that, but the technology has a high rate of error. According to data released under the UK’s Freedom of Information laws, 98 percent of “matches” made by the Metropolitan Police using facial recognition were mistakes. Despite this, police commissioner Cressida Dick said in July that she was “completely comfortable” with the trials. I am worried about where this could lead, however; because there are no regulations around this (at least not in the United States), there isn’t much we can do about it right now.