Free Republic


How to stop AI from recognizing your face in selfies
MIT Technology Review ^ | 5/5/2021 | Will Douglas Heaven

Posted on 05/12/2021 3:53:23 PM PDT by LibWhacker

Uploading personal photos to the internet can feel like letting go. Who else will have access to them, what will they do with them—and which machine-learning algorithms will they help train?

The company Clearview has already supplied US law enforcement agencies with a facial recognition tool trained on photos of millions of people scraped from the public web. But that was likely just the start. Anyone with basic coding skills can now develop facial recognition software, meaning there is more potential than ever to abuse the tech in everything from sexual harassment and racial discrimination to political oppression and religious persecution.

A number of AI researchers are pushing back and developing ways to make sure AIs can’t learn from personal data. Two of the latest are being presented this week at ICLR, a leading AI conference.

“I don't like people taking things from me that they're not supposed to have,” says Emily Wenger at the University of Chicago, who developed one of the first tools to do this, called Fawkes, with her colleagues last summer: “I guess a lot of us had a similar idea at the same time.”

Data poisoning isn’t new. Actions like deleting data that companies have on you, or deliberately polluting data sets with fake examples, can make it harder for companies to train accurate machine-learning models. But these efforts typically require collective action, with hundreds or thousands of people participating, to make an impact. The difference with these new techniques is that they work on a single person’s photos.
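The collective-action version of data poisoning can be illustrated with a toy experiment: flood a training set with mislabeled fake examples and watch a simple model degrade. Everything below is a hypothetical sketch — a 1-nearest-neighbor classifier on synthetic 2-D points standing in for face embeddings — not any real system's pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two well-separated 2-D clusters stand in for two identities' face embeddings.
def sample(center, n):
    return rng.normal(center, 0.5, (n, 2))

train_X = np.vstack([sample(-2.0, 50), sample(2.0, 50)])
train_y = np.array([0] * 50 + [1] * 50)
test_X = np.vstack([sample(-2.0, 20), sample(2.0, 20)])
test_y = np.array([0] * 20 + [1] * 20)

def predict_1nn(X, y, queries):
    # 1-nearest-neighbor: the simplest model a data scraper might fit.
    dists = np.linalg.norm(queries[:, None, :] - X[None, :, :], axis=2)
    return y[dists.argmin(axis=1)]

clean_acc = (predict_1nn(train_X, train_y, test_X) == test_y).mean()

# Poisoning: inject fake examples that look like identity 1
# but carry identity 0's label.
fake_X = sample(2.0, 100)
poison_X = np.vstack([train_X, fake_X])
poison_y = np.concatenate([train_y, np.zeros(100, dtype=int)])

poison_acc = (predict_1nn(poison_X, poison_y, test_X) == test_y).mean()
print(f"clean accuracy: {clean_acc:.2f}, poisoned accuracy: {poison_acc:.2f}")
```

Note the scale of the attack: it takes 100 fake points against 100 real ones to corrupt the model, which is why the article says these older techniques need many participants to work.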

“This technology can be used as a key by an individual to lock their data,” says Sarah Erfani at the University of Melbourne in Australia. “It’s a new frontline defense for protecting people’s digital rights in the age of AI.”

Hiding in plain sight

Most of the tools, including Fawkes, take the same basic approach. They make tiny changes to an image that are hard to spot with a human eye but throw off an AI, causing it to misidentify who or what it sees in a photo. This technique is very close to a kind of adversarial attack, where small alterations to input data can force deep-learning models to make big mistakes.

Give Fawkes a bunch of selfies and it will add pixel-level perturbations to the images that stop state-of-the-art facial recognition systems from identifying who is in the photos. Unlike previous ways of doing this, such as wearing AI-spoofing face paint, it leaves the images apparently unchanged to humans.
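The "pixel-level perturbations" idea can be sketched in a few lines. The real Fawkes optimizes its cloak against a face-embedding model; the toy version below only shows the budget side of the trick — an FGSM-style step whose per-pixel change is capped at a small epsilon, so the cloaked image looks unchanged to a human. The gradient here is random noise standing in for a real model's gradient, so this illustrates the constraint, not the attack's effectiveness.

```python
import numpy as np

rng = np.random.default_rng(1)

# A stand-in 64x64 grayscale "selfie" (the real tool works on RGB photos).
image = rng.integers(0, 256, size=(64, 64)).astype(np.float64)

# Stand-in for the gradient of a face-embedding loss w.r.t. the pixels.
# Fawkes obtains this from a real feature extractor; here it is random.
fake_gradient = rng.normal(size=image.shape)

epsilon = 4.0  # L-infinity budget: no pixel moves by more than 4 levels

# FGSM-style step: nudge each pixel by +/- epsilon along the gradient sign,
# then clip back into the valid pixel range.
cloaked = np.clip(image + epsilon * np.sign(fake_gradient), 0, 255)

max_change = np.abs(cloaked - image).max()
print(f"largest per-pixel change: {max_change}")
```

The whole point of the L-infinity cap is the article's claim that the images appear unchanged to humans: every pixel moves by at most a few intensity levels, yet a model trained on many such images builds a distorted representation of the face.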

Wenger and her colleagues tested their tool against several widely used commercial facial recognition systems, including Amazon’s AWS Rekognition, Microsoft Azure, and Face++, developed by the Chinese company Megvii Technology. In a small experiment with a data set of 50 images, Fawkes was 100% effective against all of them, preventing models trained on tweaked images of people from later recognizing those people in fresh images. The doctored training images had stopped the tools from forming an accurate representation of those people’s faces.

Fawkes has already been downloaded nearly half a million times from the project website. One user has also built an online version, making it even easier for people to use (though Wenger won’t vouch for third parties using the code, warning: “You don't know what's happening to your data while that person is processing it”). There’s not yet a phone app, but there’s nothing stopping somebody from making one, says Wenger.

Fawkes may keep a new facial recognition system from recognizing you—the next Clearview, say. But it won’t sabotage existing systems that have been trained on your unprotected images already. The tech is improving all the time, however. Wenger thinks that a tool developed by Valeriia Cherepanova and her colleagues at the University of Maryland, one of the teams at ICLR this week, might address this issue.

Called LowKey, the tool expands on Fawkes by applying perturbations to images based on a stronger kind of adversarial attack, one that also fools pretrained commercial models. Like Fawkes, LowKey is available online.

Erfani and her colleagues have added an even bigger twist. Together with Daniel Ma at Deakin University, and researchers at the University of Melbourne and Peking University in Beijing, Erfani has developed a way to turn images into “unlearnable examples,” which effectively make an AI ignore your selfies entirely. “I think it’s great,” says Wenger. “Fawkes trains a model to learn something wrong about you, and this tool trains a model to learn nothing about you.”

[Image caption: Images of me scraped from the web (top) are turned into unlearnable examples (bottom) that a facial recognition system will ignore. Credit: Sarah Erfani, Daniel Ma, and colleagues.]

Unlike Fawkes and its followers, unlearnable examples are not based on adversarial attacks. Instead of introducing changes to an image that force an AI to make a mistake, Ma’s team adds tiny changes that trick an AI into ignoring it during training. When presented with the image later, its evaluation of what’s in it will be no better than a random guess.
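A crude way to see why this works: if every training example of a person carries a strong, class-correlated "shortcut" pattern, a model trained on them latches onto the pattern and never learns the real features — so on clean test images it does no better than chance. The sketch below uses synthetic feature vectors and a nearest-centroid classifier; the real unlearnable-examples method optimizes error-minimizing noise against the training loss rather than injecting a fixed pattern, so treat this as an illustration of the principle only.

```python
import numpy as np

rng = np.random.default_rng(2)

n, d = 100, 50
# Synthetic "face features": the true identity signal lives in feature 0.
X = rng.normal(0.0, 1.0, (2 * n, d))
y = np.array([0] * n + [1] * n)
X[:n, 0] -= 1.0
X[n:, 0] += 1.0

# Simplified "unlearnable" protection: add a strong class-correlated
# shortcut pattern, so a model trained on it learns the pattern, not faces.
pattern = rng.normal(0.0, 1.0, d)
X_protected = X.copy()
X_protected[:n] += 5.0 * pattern
X_protected[n:] -= 5.0 * pattern

def centroid_predict(train_X, train_y, queries):
    # Nearest-centroid classifier standing in for a face recognizer.
    centroids = np.array([train_X[train_y == c].mean(axis=0) for c in (0, 1)])
    dists = np.linalg.norm(queries[:, None, :] - centroids[None, :, :], axis=2)
    return dists.argmin(axis=1)

# Clean test examples carry only the real signal, no shortcut pattern.
test_X = rng.normal(0.0, 1.0, (200, d))
test_y = np.array([0] * 100 + [1] * 100)
test_X[:100, 0] -= 1.0
test_X[100:, 0] += 1.0

acc_clean = (centroid_predict(X, y, test_X) == test_y).mean()
acc_unlearnable = (centroid_predict(X_protected, y, test_X) == test_y).mean()
print(f"trained on clean: {acc_clean:.2f}, on protected: {acc_unlearnable:.2f}")
```

The model trained on protected data fits its training set perfectly well — the shortcut pattern is easy to separate — which is exactly why it ends up knowing nothing useful about the person in clean photos.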

Unlearnable examples may prove more effective than adversarial attacks, since they cannot be trained against. The more adversarial examples an AI sees, the better it gets at recognizing them. But because Erfani and her colleagues stop an AI from training on images in the first place, they claim this won’t happen with unlearnable examples.

Wenger is resigned to an ongoing battle, however. Her team recently noticed that Microsoft Azure’s facial recognition service was no longer spoofed by some of their images. “It suddenly somehow became robust to cloaked images that we had generated,” she says. “We don’t know what happened.”

Microsoft may have changed its algorithm, or the AI may simply have seen so many images from people using Fawkes that it learned to recognize them. Either way, Wenger’s team released an update to their tool last week that works against Azure again. “This is another cat-and-mouse arms race,” she says.

For Wenger, this is the story of the internet. “Companies like Clearview are capitalizing on what they perceive to be freely available data and using it to do whatever they want,” she says.

Regulation might help in the long run, but that won’t stop companies from exploiting loopholes. “There’s always going to be a disconnect between what is legally acceptable and what people actually want,” she says. “Tools like Fawkes fill that gap.”

“Let’s give people some power that they didn’t have before,” she says.


TOPICS: Arts/Photography; Computers/Internet; Reference
KEYWORDS: ai; face; facialrecognition; fawkes; lowkey; privacy; recognize; stop
There is a little more at the source, including links.
1 posted on 05/12/2021 3:53:23 PM PDT by LibWhacker

To: LibWhacker
Fake nose...


2 posted on 05/12/2021 3:56:19 PM PDT by Magnum44 (...against all enemies, foreign and domestic...)

To: LibWhacker

That’s why I only send pics of the little head.


3 posted on 05/12/2021 3:59:13 PM PDT by Lurkina.n.Learnin (The veil of civilization is only 9 meals thick. )

To: LibWhacker

bookmark


4 posted on 05/12/2021 3:59:15 PM PDT by dadfly

To: LibWhacker

My yahoo information has been hacked, as has my Dept of Defense info, my Office of Personnel Management, my bank, and those who might go to my 2007 YouTube channel.

Any additional talk about “information gathering” means spit to me.


5 posted on 05/12/2021 4:00:41 PM PDT by Terry L Smith

To: LibWhacker
Fake beards for the ladies...


6 posted on 05/12/2021 4:00:43 PM PDT by Magnum44 (...against all enemies, foreign and domestic...)

To: Magnum44

LOL


7 posted on 05/12/2021 4:00:50 PM PDT by LibWhacker

To: LibWhacker

bttt


8 posted on 05/12/2021 4:01:12 PM PDT by Fungi

To: LibWhacker

If the terminal is slow to accept my credit card at Walmart (which it has been since masks became popular) I look directly into the camera, and it is instantly approved. So it doesn’t really need a full face - I’m guessing eyes and ears are enough.


9 posted on 05/12/2021 4:02:31 PM PDT by PAR35

To: PAR35

I use facial ID to unlock my iPhone. Doesn’t work while wearing a mask, although no problem with glasses and a ball cap.


10 posted on 05/12/2021 4:06:34 PM PDT by cornfedcowboy ( )

To: LibWhacker; SunkenCiv; upchuck
news you can use


11 posted on 05/12/2021 4:08:18 PM PDT by a fool in paradise (Lean on Joe Biden to follow Donald Trump's example and donate his annual salary to charity. L)

To: LibWhacker

Don’t take selfies?


12 posted on 05/12/2021 4:23:32 PM PDT by Rurudyne (Standup Philosopher)

To: Magnum44

I’d think it would be something like moving the relative position or orientation of the eyes to other marker points on the face like tip of nose or cheekbones, etc.


13 posted on 05/12/2021 4:27:37 PM PDT by glorgau

To: LibWhacker

14 posted on 05/12/2021 4:28:32 PM PDT by Spirit of Liberty (Idiots are of two kinds: those who try to be smart and those who think they are smart.)

To: LibWhacker

Easy. Don’t take selfies. i already know what I look like. Why do I need a picture of myself?


15 posted on 05/12/2021 4:32:48 PM PDT by Organic Panic (Democrats. Memories as short as Joe Biden's eyes.)

To: Noumenon

Ping.


16 posted on 05/12/2021 4:38:23 PM PDT by DuncanWaring (The Lord uses the good ones; the bad ones use the Lord.)

To: LibWhacker

I know how to do that..dont take em of yourself.


17 posted on 05/12/2021 4:38:55 PM PDT by crz

To: LibWhacker

AI is much more than recognition of a face. We have chips that have an AI that can identify people on video, as people. These chips are cheap, which means that home security can now ignore cats, dogs, raccoons or trash that sets off a motion detector, and focus on human intrusion.

We have facial recognition on cameras that delay the shutter so you don’t catch the subject(s) mid-blink.

Yes, it can be used for evil, but it can be used for good as well.


18 posted on 05/12/2021 4:41:41 PM PDT by Hodar (A man can fail many times, but he isn't a failure until he begins to blame somebody else.- Burroughs)

To: LibWhacker
Spokane Eye Clinic wanted to add my photo to their files. I declined. Their excuse was, "So we can recognize you when we call your name in the waiting room."

I said that if you call my name, I'm going to hear it and respond. That shut them up. Hmmmmmmm.

19 posted on 05/12/2021 4:46:47 PM PDT by Noumenon (The Second Amendment exists primarily to deal with those who just won't take no for an answer. KTF)

To: DuncanWaring

Getting worse, isn’t it?


20 posted on 05/12/2021 4:47:15 PM PDT by Noumenon (The Second Amendment exists primarily to deal with those who just won't take no for an answer. KTF)



Disclaimer: Opinions posted on Free Republic are those of the individual posters and do not necessarily represent the opinion of Free Republic or its management. All materials posted herein are protected by copyright law and the exemption for fair use of copyrighted works.


FreeRepublic, LLC, PO BOX 9771, FRESNO, CA 93794
FreeRepublic.com is powered by software copyright 2000-2008 John Robinson