The UK is frequently cited as the nation with the highest level of surveillance. Whether or not that has always been true, there are now undoubtedly millions of surveillance cameras in public places, not to mention in private buildings and residences. Behind those lenses, things are changing in ways most people are rarely aware of, and the consequences for privacy urgently need to be examined.
Automatic facial recognition is currently the hottest ticket in this business, having been introduced in several places around the world, including the US, China, Germany, and Singapore. Police forces say that piloting such security systems has allowed them to test the technology's ability to help identify potential terrorists and other known offenders. However, this needs to be weighed against other concerns. The most fundamental is whether making our every move visible to the state goes too far against our expectation of privacy and anonymity in public places.
The effectiveness of today's facial recognition systems is another concern. Trials have suggested that their success rate in correctly matching faces can be as low as 2%. This is related to bias inherent in the software, which makes it much less effective at identifying women and people with darker skin tones. As a result, it risks worsening relations between the police and ethnic minorities.
The police practice of using “watch list” databases of faces against which live images are matched could make this situation much worse. These databases frequently contain custody photographs of people who have been detained; those individuals may never have been convicted of a crime and are unlikely to have consented to their data being used in this way.
Setting precedents for privacy: Bringing surveillance into the open through legal challenges in the UK
For these reasons, the use of automatic facial recognition software has generated considerable debate. Until the technology is more dependable, it should probably be used with extreme caution. In recent years there have been two important pilots in the UK, one in London and one in south Wales. Both are the focus of judicial review cases that will be resolved in the coming months, brought by the civil liberties organisations Liberty and Big Brother Watch respectively.
A world of tomorrow
In May, San Francisco, California, banned the use of facial recognition by its public services, and other American cities are expected to follow suit, even as the technology is already deployed in places like Chicago, New York, and Detroit. The technology has also sparked considerable debate in Canada, where it is in use in Toronto and elsewhere.
Facial recognition draws attention to broader questions about what kinds of monitoring equipment society should tolerate. These questions are made harder to answer by the fact that security cameras are becoming more advanced and computerised without necessarily changing in appearance. Because there is no signage or other information telling us about their enhanced capabilities, it is far from clear what is going on behind the scenes.
New types of cameras, such as drones, head cams, body-worn video devices, and dash-mounted cameras, have emerged as the technology has become more affordable and miniaturised. In parallel, imaging and recording methods have become increasingly standardised. As a result, systems are now more interconnected, and image quality has improved to the point where footage can serve as reliable evidence in court proceedings.
Alongside advances in sound and smell analysis, we are also witnessing the emergence of cameras that can track and recognise objects as well as faces. Police departments in the US and the UK are testing systems that forecast criminal behaviour. Compared to the traditional CCTV cameras we are used to, this is a quantum leap.
Governance and policy must change swiftly to keep pace with these developments. In England and Wales, surveillance cameras are currently regulated by the Information Commissioner’s Office, which is in charge of data privacy in the UK, together with the specialist office of the Surveillance Camera Commissioner. The Office of the Biometrics Commissioner is also relevant to the use of facial recognition technologies.
Surveillance Camera Day
The majority of surveys indicate that people support simple CCTV cameras, but lawmakers should consider whether people would still support these systems if they realised what they were capable of. Judging by most media responses to facial recognition, it appears not.
I believe that most of these technological advancements could be used to improve surveillance systems if they were properly regulated, but cameras need to be seen to be deployed in society's interests and with public support. So where should policymakers draw the line?
Surveillance cameras need not advance along a technologically predetermined path in which they inevitably become more and more intrusive. Surveillance Camera Day gives everyone a chance to influence the conversation. It will be intriguing to see how the public and other participants respond.