Computer engineers develop tool that aims to protect online photos from facial recognition

Before and after photos of Jessica Simpson, Gwyneth Paltrow and Patrick Dempsey that were cloaked by the Fawkes team. PHOTO: NYTIMES
Clearview AI scrapes billions of online photos to build a tool that could lead police from a face to a Facebook account. PHOTO: NYTIMES

NEW YORK (NYTIMES) - In recent years, companies have been prowling the Web for public photos associated with people's names that they can use to build enormous databases of faces and improve their facial recognition systems, adding to a growing sense that personal privacy is being lost, bit by digital bit.

A startup called Clearview AI, for example, scraped billions of online photos to build a tool for police that could lead them from a face to a Facebook account, revealing a person's identity.

Now researchers are trying to foil those systems. A team of computer engineers at the University of Chicago has developed a tool that disguises photos with pixel-level changes that confuse facial recognition systems.

Named Fawkes in honour of the Guy Fawkes mask favoured by protesters worldwide, the software was made available to developers on the researchers' website last month.

After being picked up on Hacker News, it has been downloaded more than 50,000 times. The researchers are working on a free app version for non-coders, which they hope to make available soon.

The software is not intended to be just a one-off tool for privacy-loving individuals. If deployed across millions of images, it would be a broadside against facial recognition systems, poisoning the accuracy of the data sets they gather from the Web.

"Our goal is to make Clearview go away," said Dr Ben Zhao, a professor of computer science at the University of Chicago.

Fawkes converts an image - or "cloaks" it, in the researchers' parlance - by subtly altering some of the features that facial recognition systems depend on when they construct a person's face print.

In a research paper, reported earlier by OneZero, the team describes "cloaking" photos of actress Gwyneth Paltrow using actor Patrick Dempsey's face, so that a system learning what Paltrow looks like based on those photos would start associating her with some of the features of Dempsey's face.

The changes, usually subtle and not perceptible to the naked eye, would prevent the system from recognising Paltrow when presented with a real, uncloaked photo of her. In testing, the researchers were able to fool facial recognition systems from Amazon, Microsoft and Chinese tech company Megvii.
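
In machine-learning terms, this kind of cloaking resembles a targeted adversarial perturbation: a small, tightly bounded change to the pixels that drags the photo's machine-readable face print toward another identity. The sketch below illustrates only that general idea, not the Fawkes code itself; the tiny untrained network standing in for a face recognition model, the random stand-in images and the pixel budget are all illustrative assumptions.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Placeholder feature extractor (hypothetical): real systems use large,
# pretrained face-embedding networks, but any differentiable model
# illustrates the mechanics.
embed = nn.Sequential(
    nn.Conv2d(3, 8, 3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(8, 16, 3, stride=2, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 64),
)
for p in embed.parameters():
    p.requires_grad_(False)  # only the cloak is optimised, not the model

user_photo = torch.rand(1, 3, 112, 112)    # stand-in for a photo of the user
target_photo = torch.rand(1, 3, 112, 112)  # stand-in for the dissimilar target face
target_feat = embed(target_photo)

# Learn a small perturbation (the "cloak") that pulls the photo's embedding
# toward the target identity while keeping pixel changes hard to see.
delta = torch.zeros_like(user_photo, requires_grad=True)
opt = torch.optim.Adam([delta], lr=0.01)
budget = 0.03  # assumed per-pixel bound on the change

for step in range(200):
    opt.zero_grad()
    cloaked = (user_photo + delta).clamp(0, 1)
    # Maximise similarity to the target's face print.
    loss = 1 - torch.cosine_similarity(embed(cloaked), target_feat).mean()
    loss.backward()
    opt.step()
    with torch.no_grad():
        delta.clamp_(-budget, budget)  # enforce the visual-distortion budget

cloaked_photo = (user_photo + delta).clamp(0, 1).detach()
```

A model trained on such cloaked photos would learn a face print skewed toward the target identity, which is why a genuine, uncloaked photo later fails to match.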

To test the tool, a New York Times reporter asked the team to cloak images of herself and her family. The original and cloaked images were then uploaded to Facebook to see whether they fooled the social network's facial recognition system. It worked.

However, the changes to the photos were noticeable to the naked eye.

The researchers had a few explanations for this. One is that the software is designed to match a person with the face template of someone who looks as different from him or her as possible, pulling from a database of celebrity faces. That usually ends up being a person of the opposite sex, which leads to obvious problems.

"Women get moustaches, and guys get extra eyelashes or eye shadow," Dr Zhao said. He is enthusiastic about what he calls "privacy armour" and previously helped design a bracelet that stops smart speakers from overhearing conversations.

The team says it plans to tweak the software so that it will no longer subtly change the sex of users.
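
That selection step can be pictured as a farthest-neighbour search in embedding space: among a database of candidate faces, pick the one whose face print is least similar to the user's. A minimal sketch, with random vectors standing in for real face embeddings:

```python
import torch

torch.manual_seed(0)

# Hypothetical precomputed embeddings for a celebrity face database.
celeb_embeddings = torch.randn(1000, 64)
user_embedding = torch.randn(64)

# Choose the celebrity least similar to the user as the cloaking target.
sims = torch.cosine_similarity(celeb_embeddings, user_embedding.unsqueeze(0))
target_index = sims.argmin().item()

# Restricting the candidate pool (for example, to faces of the same sex)
# would be one way to avoid the moustache and eyeshadow artefacts
# described above.
```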

The researchers said that, ideally, people would start cloaking all the images they upload. A company such as Clearview that scrapes those photos would then be unable to build a functioning database, because an unidentified real-world photo of a person would not match the template Clearview had assembled from that person's online photos.
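
The intended failure mode amounts to a simple similarity check: the scraper's identity template, averaged from cloaked photos, should fall below the match threshold when compared against an uncloaked, real-world probe. The embeddings and threshold below are illustrative assumptions, not values from the paper:

```python
import torch

torch.manual_seed(0)

# Hypothetical embeddings: a template averaged from scraped, cloaked photos
# versus a probe embedding from a real-world, uncloaked photo.
cloaked_gallery = torch.randn(20, 64)   # embeddings of scraped cloaked photos
template = cloaked_gallery.mean(dim=0)  # identity template built by the scraper
probe = torch.randn(64)                 # embedding of an uncloaked face

similarity = torch.cosine_similarity(template, probe, dim=0).item()
THRESHOLD = 0.6  # assumed decision threshold for declaring a match
print(f"similarity={similarity:.2f}, match={similarity > THRESHOLD}")
```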

But Clearview's chief executive Hoan Ton-That said his company could use images cloaked by Fawkes to improve its ability to make sense of altered images.

"There are billions of unmodified photos on the internet, all on different domain names," he said. "In practice, it's almost certainly too late to perfect a technology like Fawkes and deploy it at scale."

Other experts were also sceptical that Fawkes would work.

Mr Joseph Atick, a facial recognition pioneer who has come to regret the surveillance society he helped to create, said the sheer volume of images people have already made available online would be too great to overcome.

"The cat is out of the bag. We're out there," Mr Atick said. "While I encourage this type of research, I'm highly sceptical this is a solution to solve the problem that we're faced with."

Dr Elizabeth Joh, a law professor at the University of California, Davis, has written about tools like Fawkes as "privacy protests": attempts by individuals to thwart surveillance for reasons other than hiding criminal activity.

She has repeatedly seen what she called a "tired rubric" of surveillance, then counter-surveillance and then anti-counter-surveillance, as new monitoring technologies are introduced.

"People are feeling a sense of privacy exhaustion," Dr Joh said. "There are too many ways that our conventional sense of privacy is being exploited in real life and online."
