Posted: 4 Min Read | Expert Perspectives

Out of Focus: Digital Privacy and the Ability to Hide Your Photo from Facial Recognition

As concern about privacy grows, we’ve developed ‘Privacy Filters’ that will let consumers protect their digital images from detection

I have been studying and developing technology to protect consumers and enterprises from data misuse for a quarter of a century, and yet when it comes to knowing just who or what services have my personal data, I’m all but clueless.

Even among other specialists in the field of digital privacy, I’m far from alone.

Such is the complex nature of the ever-expanding state of data use in 2019. As people share and upload more and more bits of information, and as an increasing number of devices watch and track us, it’s become almost impossible to know where data about us resides and, more importantly, what’s being done with it.

That’s where Symantec Research Labs (SRL) comes in. As the Global Head of SRL, my team works to figure out what security measures consumers and businesses will need in the coming years and then develops ways to arm them with technology designed to protect data. We all have the right to have our data digitally protected – and that includes the vast amount of image data of ourselves spreading across the Web and being identified by photo-recognition software.

Machine Learning and Protecting Facial Privacy

Over the past several years, I’ve had a team working on the various ways machine learning intersects with digital privacy. Machine learning has plenty of useful applications, of course, which is why digital services rely on it as they seek to improve and personalize user experiences.

But these capabilities also make machine learning a useful tool for large-scale invasion of privacy, which can have harmful consequences for users. Many social media platforms are used to share pictures, while their users are completely unaware that large-scale facial recognition can be used to identify and track them in those pictures. Services like findface.ru use facial recognition to identify individuals in uploaded pictures and have allegedly been used by law-enforcement agencies to crack down on protestors and suppress individual liberty. The use of facial recognition technology for law enforcement itself impacts civil liberties, and a number of communities are questioning whether the utility facial recognition provides outweighs the harm it can cause.

Fortunately, the advanced machine learning classifiers that power facial recognition are not perfect: they are susceptible to adversarial manipulations, which add small-but-specific noise to the input to cause incorrect classification. In many cases, the manipulations required to fool a classifier are too small to be perceived by humans.
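The mechanics can be sketched with a toy model. Below is a minimal illustration of the fast gradient sign method (FGSM), one well-known way to craft such small-but-specific noise, run against a hypothetical logistic-regression "recognizer." The weights, dimensions, and epsilon here are invented for illustration and have nothing to do with any real facial-recognition service or with our Privacy Filter:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a face recognizer: logistic regression on flattened
# "pixels." The weights w are fixed, hypothetical "trained" parameters.
dim = 64
w = rng.normal(size=dim)

def predict(x):
    """Return the model's match score (probability the face is recognized)."""
    return 1.0 / (1.0 + np.exp(-(w @ x)))

# A sample "image" the model confidently recognizes (score well above 0.5).
x = 0.04 * np.sign(w) + 0.005 * rng.normal(size=dim)

def fgsm(x, epsilon=0.05):
    """FGSM: step each pixel by at most epsilon against the gradient of the
    score with respect to the input, pushing the score down."""
    p = predict(x)
    grad = p * (1 - p) * w          # d(score)/dx for logistic regression
    return x - epsilon * np.sign(grad)

x_adv = fgsm(x)

print(f"original score:   {predict(x):.3f}")
print(f"perturbed score:  {predict(x_adv):.3f}")
print(f"max pixel change: {np.max(np.abs(x_adv - x)):.3f}")
```

The point of the sketch is the size mismatch: every pixel moves by at most epsilon, yet the score collapses, because the noise is aligned with exactly the directions the classifier is most sensitive to.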

Such manipulations can be found easily, without requiring internal details of the system being fooled. While such attacks can have serious security implications for the reliability of machine learning classifiers – e.g., visual perception in self-driving cars – studying these adversarial techniques is how we came up with our ‘Privacy Filter,’ which lets users mask their digital portraits from facial recognition. Our Privacy Filter puts an adversarial technique to good use: adversarial noise – an altered pixel, say, or an overlay – is added to a picture; the change is invisible to the human eye, yet it makes the new image unrecognizable to the machines. While an adversary tries to avoid photo detection for bad purposes, our aim is to help consumers avoid detection for a good purpose: strengthening their digital privacy.
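The "no internal details required" point can also be sketched with a toy model. The search below, in the spirit of query-based black-box attacks such as SimBA, lowers a match score using only score queries – it never sees the model's weights or gradients. Again, the recognizer and every parameter are hypothetical stand-ins, not our actual Privacy Filter:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical black-box "recognizer": we may query a match score for an
# image, but the weights below are hidden from the attack itself.
_dim = 64
_w = rng.normal(size=_dim)

def query_score(x):
    return 1.0 / (1.0 + np.exp(-(_w @ x)))

# Start from an "image" the service confidently recognizes.
x = 0.04 * np.sign(_w) + 0.005 * rng.normal(size=_dim)

def black_box_mask(x, epsilon=0.05):
    """Query-only search: nudge one pixel at a time by +/-epsilon and keep
    any change that lowers the match score. No gradients, no weights."""
    x_adv = x.copy()
    best = query_score(x_adv)
    for i in rng.permutation(x.size):
        for step in (epsilon, -epsilon):
            cand = x_adv.copy()
            cand[i] = x[i] + step       # stay within epsilon of the original
            score = query_score(cand)
            if score < best:
                x_adv, best = cand, score
                break
    return x_adv

x_adv = black_box_mask(x)

print(f"score before masking: {query_score(x):.3f}")
print(f"score after masking:  {query_score(x_adv):.3f}")
print(f"max pixel change:     {np.max(np.abs(x_adv - x)):.3f}")
```

Real services rate-limit queries and use far larger models, so practical black-box attacks are more query-efficient than this per-pixel loop – but the principle is the same: trial-and-error against the score alone is enough.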

The implications of this are enormous. In theory, people could not only employ such a Privacy Filter to stop social media platforms from detecting their images, but could also use it to maintain privacy from invasive environments or institutions. With the ongoing rollout of 5G, and with 18 billion IoT devices expected to be in use worldwide by 2022, it’s safe to assume that devices will process images of your face far more often than they do today. And how often that currently happens is anybody’s guess.

For the most part, people don’t seem overly bothered by services that recognize their photos. Indeed, the technology can be helpful – you upload a batch of photos and the software helps you organize them by recognizing the people in them.

But as is often the case with technological developments, what starts out as helpful eventually becomes a concern. Remember the Russian facial-recognition app FindFace that let you photograph a stranger and then took you to their social media profile? I can think of all sorts of reasons one would want their image masked from such a service. Today, FindFace markets itself to governments and businesses, and plenty of other facial-recognition apps exist to help people learn more about strangers. Law enforcement agencies the world over use facial recognition software, as does the oft-stalked Taylor Swift and, potentially, the stalkers themselves.

For now, our Privacy Filter is a research prototype that shows it’s possible to protect your image in a world ever more determined to photograph and identify you. We will begin testing it with black-box attacks against big services, and eventually we hope to add it to our privacy tools.


About the Author

Dr. Petros Efstathopoulos

Global Head of Symantec Research Labs

Petros joined Symantec Research Labs in 2009 and has focused on next-generation storage/backup systems, portable storage security, network security, privacy and identity. As the Global Head of SRL he is responsible for Lab strategy, direction, and growth.