Latest facial recognition software can identify you even if your face is COVERED, exchanging even more privacy for “safety”

Thursday, September 14, 2017 by J.D. Heyes

Soon it will be impossible to cover your face and hide your identity while engaging in criminal activity, thanks to up-and-coming facial recognition technology.

The bad news is, you won’t be able to hide in plain view either, just to protect your privacy.

As reported by the UK’s Daily Mail, the technology under development has already progressed far enough to virtually “unmask” people in most situations. The Disguised Face Identification (DFI) system employs an AI neural network to map facial features hidden behind scarves, headgear, and even fake beards and mustaches in order to identify people.

No doubt the system can be integrated with criminal databases so that wanted people can be flagged instantaneously; in fact, such systems already exist for automobiles. As Natural News has reported as far back as 2013, police departments have been using license plate readers that let officers instantly identify people wanted for various crimes as they drive past those vehicles.

Police aren’t concerned about privacy and the incredible amount of hackable data being collected by the readers. Rather, they’re more concerned with revenues: As the Boston Globe reported in May 2013, one $24,000 plate reader paid for itself in just 11 days. “We located more uninsured vehicles in our first month . . . using [the camera] in one cruiser than the entire department did the whole year before,” said Boston PD Sgt. Robert Griffin.

Now, authorities want to take instant database identification a big step further with new facial recognition technology, which will put a quick end to remaining anonymous in public.

“This is very interesting for law enforcement and other organizations that want to capture criminals,” said Amarjot Singh, a University of Cambridge researcher who helped develop the DFI technology, in an interview with Inverse.

Here’s how the technology works: DFI uses a deep-learning neural network that the research team ‘trained’ by feeding it images of test subjects wearing several different kinds of disguises. The images fed into the network also included both simple and complex backgrounds, challenging the AI to pick out disguised features under a variety of scenarios.
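The researchers’ exact architecture isn’t spelled out in the coverage, but a minimal sketch of that kind of supervised training might look like the following. The dataset folder layout, network size, and hyperparameters here are illustrative assumptions, not the actual DFI pipeline:

```python
# Hypothetical sketch of training a classifier on images of disguised
# subjects photographed against simple and complex backgrounds.
# Dataset layout, architecture, and hyperparameters are assumptions
# for illustration only, not the researchers' actual system.
import torch
import torch.nn as nn
from torchvision import datasets, transforms
from torch.utils.data import DataLoader

# Assumed folder layout: disguised_faces/train/<subject_id>/<image>.jpg,
# mixing scarves, hats, glasses, fake beards, and varied backgrounds.
transform = transforms.Compose([
    transforms.Resize((128, 128)),
    transforms.ToTensor(),
])
train_set = datasets.ImageFolder("disguised_faces/train", transform=transform)
loader = DataLoader(train_set, batch_size=32, shuffle=True)

num_subjects = len(train_set.classes)

# Small CNN classifier; the real DFI network is more elaborate.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 32 * 32, num_subjects),
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(10):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
```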

Notes the Daily Mail:

The AI identifies people by measuring the distances and angles between 14 facial points: ten for the eyes, three for the lips, and one for the nose.

It uses these readings to estimate the hidden facial structure, and then compares this with learned images to unveil the person’s true identity.
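To make that geometry step concrete, here is a minimal, hypothetical sketch of how 14 detected keypoints could be turned into distance-and-angle features and matched against stored identities. The keypoint detector, the enrolled database, and the nearest-neighbor comparison are all assumptions for illustration, not the published method:

```python
# Hypothetical illustration of the geometry step described above:
# turn 14 facial keypoints (10 eye, 3 lip, 1 nose) into a feature
# vector of pairwise distances and angles, then compare it against
# stored "learned" identities via nearest-neighbor matching.
import itertools
import math
import numpy as np

def geometry_features(points: np.ndarray) -> np.ndarray:
    """points: array of shape (14, 2) holding (x, y) keypoint coordinates."""
    feats = []
    for i, j in itertools.combinations(range(len(points)), 2):
        dx, dy = points[j] - points[i]
        feats.append(math.hypot(dx, dy))   # distance between the pair
        feats.append(math.atan2(dy, dx))   # angle of the connecting line
    return np.array(feats)

def identify(detected_points: np.ndarray, database: dict) -> str:
    """database maps a person's name to their enrolled (14, 2) keypoints."""
    query = geometry_features(detected_points)
    best_name, best_dist = None, float("inf")
    for name, enrolled_points in database.items():
        dist = np.linalg.norm(query - geometry_features(enrolled_points))
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name
```

In this sketch, each of the 91 point pairs contributes one distance and one angle, and the identity with the closest feature vector wins; a production system would obviously use a far more robust comparison.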

Good, you say. In this age of masked Antifa terrorists, it will be good for police to have the technology to identify who is actually responsible for attacking other people, burning cars, and destroying businesses.

But what about when the technology misidentifies someone as being guilty of committing a crime or act of violence? Because that’s bound to happen; no technology is 100-percent effective or, in this case, foolproof.

Also, there is so much potential for abuse with this technology. If it is deployed widely, authorities will literally be able to track you no matter where you go.

Plus, this technology dramatically alters the relationship between American citizens and all levels of government. Our founders and subsequent generations established a system of justice that presumes innocence until guilt can be proven; technologies like DFI and license plate readers are shifting that paradigm toward “presumed guilty until authorities can prove you innocent with a run through government criminal databases.”

And, of course, there is the dramatic loss of privacy and the threat in the Internet age of having more of your personal information stolen from yet another database.

“…[T]his is maybe the third or fourth most worrying ML paper I’ve seen recently re: AI and emergent authoritarianism. Historical crossroads,” tweeted Dr. Zeynep Tufekci, a sociologist at the University of North Carolina, when she shared the research on Twitter.

“Yes, we can & should nitpick this and all papers but the trend is clear. Ever-increasing new capability that will serve authoritarians well,” she added.

J.D. Heyes is a senior writer for NaturalNews.com and NewsTarget.com, as well as editor of The National Sentinel.

Sources include:

Twitter.com

DailyMail.co.uk

NaturalNews.com


