05/16/2018 / By Edsel Cook
The body cameras worn by police officers may one day serve as the metaphorical hundred eyes of an artificial intelligence (AI) version of Argus Panoptes. Newly developed AI software may be able to process surveillance video from those cameras in real time, The Wall Street Journal reported.
According to its creators, the high-speed processing technology will make law enforcement officers more effective. Privacy advocates, on the other hand, warn that persistent AI surveillance will endanger the privacy and lives of innocent individuals.
Body cameras and video surveillance cameras will receive algorithms that can identify potential suspects and alert a human police officer. The officer can then take appropriate action, such as stopping or arresting the suspect, or even drawing a firearm. (Related: AI is the “enabling technology” for the coming global surveillance state… you will be watched by artificial intelligence.)
One such system is the result of a partnership between Motorola Solutions and Neurala. The latter has developed AI software for the body-worn cameras produced by the former for law enforcement customers.
The Neurala AI has access to a database that contains photos and characteristics of suspects and missing persons. Details such as gender, age, hair color, and distinctive markings are also indexed.
According to its makers, the body camera will scan everyone in its field of view. The AI will examine their faces in real time to see whether anyone matches the description or photo of a suspect or missing person. Upon confirming a match, the software notifies the police officer.
The AI-driven facial recognition software learns from experience like a human would. The more information it collects, the faster it gets at identifying subjects.
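The matching loop described above (build a gallery of known faces, scan each frame, compare face encodings, alert on a hit) can be sketched with off-the-shelf tools. The following is not Neurala's software; it is a minimal illustration using the open-source face_recognition library, with a hypothetical watchlist/ directory of subject photos standing in for the suspect database:

```python
# Minimal sketch of the matching loop described above -- NOT Neurala's
# actual software. Assumes the open-source face_recognition and OpenCV
# libraries and a hypothetical watchlist/ directory of subject photos.
import os
import cv2                    # pip install opencv-python
import face_recognition      # pip install face_recognition

WATCHLIST_DIR = "watchlist"  # hypothetical folder: one photo per subject

# Build the gallery: one 128-dimension face encoding per watchlist photo.
known_encodings, known_names = [], []
for filename in os.listdir(WATCHLIST_DIR):
    image = face_recognition.load_image_file(os.path.join(WATCHLIST_DIR, filename))
    encodings = face_recognition.face_encodings(image)
    if encodings:                           # skip photos with no detectable face
        known_encodings.append(encodings[0])
        known_names.append(os.path.splitext(filename)[0])

video = cv2.VideoCapture(0)                 # stand-in for a body-camera feed
while True:
    ok, frame = video.read()
    if not ok:
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)    # library expects RGB
    # Detect every face in the field of view, then encode each one.
    locations = face_recognition.face_locations(rgb)
    for encoding in face_recognition.face_encodings(rgb, locations):
        # Lower distance means a closer match; 0.6 is the library's default cutoff.
        distances = face_recognition.face_distance(known_encodings, encoding)
        if len(distances) and distances.min() < 0.6:
            # In the deployed system this step would notify the officer instead.
            print(f"Possible match: {known_names[distances.argmin()]}")
video.release()
```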
Critics cite the known shortcomings of facial recognition technology as an argument against pairing it with AI processing. One sticking point is the software’s difficulty in distinguishing the faces of people with darker skin.
According to a recent MIT Media Lab study, commonly available facial recognition software misidentifies darker-skinned faces at significantly higher rates than lighter-skinned ones. The technology performed best at identifying white male subjects.
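What such a study measures is, at bottom, a per-group error rate. A toy audit might look like the sketch below; the records and field names are entirely hypothetical, not the MIT data:

```python
# Hypothetical per-group error audit -- the records and group labels
# are illustrative, not taken from the MIT Media Lab study.
from collections import defaultdict

results = [              # (correctly_identified, group) for each test subject
    (True, "lighter"), (False, "darker"), (True, "lighter"), (False, "darker"),
    # ...one entry per test subject in a real evaluation
]

totals, errors = defaultdict(int), defaultdict(int)
for correct, group in results:
    totals[group] += 1
    if not correct:
        errors[group] += 1

# A large gap between groups is exactly the disparity critics point to.
for group in totals:
    rate = errors[group] / totals[group]
    print(f"{group}: {rate:.1%} error rate over {totals[group]} subjects")
```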
Privacy advocates were even more hostile to the idea of AI surveillance, reports The Daily Mail.
“All of the sudden we have lost our ability to be relatively anonymous in society, to be able to walk about without fear that the government is tracking our every move,” warns Electronic Frontier Foundation attorney Jennifer Lynch.
According to Georgetown Law School researchers, the facial profiles of half the adult population of America have already ended up in police databases.
Even the employees of tech firms are aware of the contentious nature of their technology. TaeWoo Kim, chief researcher at tech firm One Smart Labs, describes AI-driven facial recognition as creepy and dystopian. Still, Kim believes the technology can stop crime and terrorism by tracking down suspects.
As of this writing, police departments are restricted to using facial recognition technology to track down suspects by comparing their images to mug shots or driver’s license photos. Tech firms have also promised that they are reducing the risk of bias by training their AI software on publicly available photos instead of law enforcement databases.
“We’ve worked really hard on training with a diverse data set to make sure that it is balanced and unbiased,” says Paul Steinberg, the chief technology officer of Motorola Solutions.
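A claim like Steinberg's can be sanity-checked by counting demographic labels across the training set. The sketch below assumes a hypothetical labeled dataset; the field names and the downsampling step are illustrative, not Motorola's actual process:

```python
# Sketch of a training-set balance check -- the dataset and its labels
# are hypothetical, not Motorola's actual training data.
from collections import Counter
import random

# Each training photo carries a demographic label in this toy example.
dataset = [
    {"path": "img001.jpg", "group": "darker_female"},
    {"path": "img002.jpg", "group": "lighter_male"},
    # ...thousands more entries in a real dataset
]

counts = Counter(example["group"] for example in dataset)
print(counts)               # reveals over- or under-represented groups

# One common mitigation: downsample every group to the smallest group's size.
target = min(counts.values())
balanced = []
for group in counts:
    members = [ex for ex in dataset if ex["group"] == group]
    balanced.extend(random.sample(members, target))
```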
Find out more about potentially unlawful surveillance at PrivacyWatch.news.