Free Republic

Deepfake porn and the ethics of being able to watch whatever your imagination desires
metro.co.uk ^ | Friday 31 May 2019 8:22 am | Jessica Lindsay

Posted on 05/31/2019 6:20:47 AM PDT by Red Badger

Pornography online attracts millions of erotica-hungry people ready to see sex on-demand. You can simply ask your phone to show you anything you desire and there it is: any time, any place.

With the advent of deepfake porn, the possibilities have expanded even further, with people who have never starred in adult films looking as though they’re doing sexual acts on camera. Experts have warned that these videos enable all sorts of bad things to happen, from paedophilia to fabricated revenge porn.

What are deepfakes?

Deepfakes are videos and images that use deep learning AI to forge something not actually there. This can be done to make a fake speech to misrepresent a politician’s views, or to create porn videos featuring people who did not star in them.

They’re made in two ways. One is a generative adversarial network – or GAN. This is a type of AI that has two parts: one which creates the fake images, and one that works out how realistic they are, learning from its past mistakes. Autoencoders are another way to create deepfakes. These are neural networks that can learn all the features of a given image, then decode those features so they can change the image. These methods vary in efficacy and quality, with GANs giving less blurry results but being trickier to train.

Samsung recently created an AI that was able to make deepfake videos using single images, including the Mona Lisa and the Girl With A Pearl Earring. We saw these iconic paintings smiling, talking, and looking completely alive.

In recent weeks, there has been an explosion of face swapping content, with Snapchat and FaceApp (among others) releasing realistic filters that allowed you to see your looks as the opposite gender, as well as previous ageing filters going viral once more.

For all the fun, however, there is a darker side to using AI to create deepfakes. A number of celebrities have had their faces superimposed onto pornographic videos, with the likes of Selena Gomez, Emma Watson, Scarlett Johansson, and Jennie from girl group Blackpink falling victim. Deepfakes of Donald Trump and Barack Obama have been made and there are concerns that they could be used to undermine democracy as well as people’s personal privacy.

DARPA in the US has spent millions on ‘media forensics’ to thwart these videos, working with academics across the world to detect what’s real and otherwise. But, according to Hany Farid, a Dartmouth College computer-science professor who advises a similar forensic fake-spotter service called Truepic, specialists working to build these systems are ‘still totally outgunned’.

In the UK, there is no specific legislation against deepfakes (but those distributing videos can be charged with harassment), bringing calls for more stringent laws on altered images. In principle, it makes sense that someone could claim that their likeness was used with malicious intent, and this could be tried as defamation or under a false light tort in the US. Cases could also be brought under revenge porn laws, or as identity theft or cyber-stalking.

‘In the US, the legal options are small but potent if (big if) one has the funds to hire an attorney and one can find the creator,’ Danielle Citron, professor at the University of Maryland, tells Metro.co.uk. ‘Defamation and intentional infliction of emotional distress are potential claims.’
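To make the GAN idea above concrete, here is a minimal, purely illustrative sketch in Python using the PyTorch library. The layer sizes, the LATENT_DIM and IMG_PIXELS constants, and the train_step helper are assumptions made for this example; they are not taken from the article or from any actual deepfake tool, which would use far larger, face-specific models.

import torch
import torch.nn as nn

LATENT_DIM = 100       # size of the random noise vector fed to the generator
IMG_PIXELS = 64 * 64   # toy 64x64 grayscale images, flattened into vectors

# Part one: the generator turns random noise into a fake image.
generator = nn.Sequential(
    nn.Linear(LATENT_DIM, 256), nn.ReLU(),
    nn.Linear(256, IMG_PIXELS), nn.Tanh(),
)

# Part two: the discriminator scores how realistic an image looks (0 = fake, 1 = real).
discriminator = nn.Sequential(
    nn.Linear(IMG_PIXELS, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1), nn.Sigmoid(),
)

loss = nn.BCELoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

def train_step(real_images: torch.Tensor) -> None:
    """One adversarial round: the discriminator learns to separate real from fake,
    then the generator learns from its mistakes by trying to fool the updated discriminator."""
    batch = real_images.size(0)
    real_labels = torch.ones(batch, 1)
    fake_labels = torch.zeros(batch, 1)

    # Train the discriminator on a mix of real and freshly generated images.
    fake_images = generator(torch.randn(batch, LATENT_DIM)).detach()
    d_loss = loss(discriminator(real_images), real_labels) + \
             loss(discriminator(fake_images), fake_labels)
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # Train the generator to make the discriminator label its output as real.
    g_loss = loss(discriminator(generator(torch.randn(batch, LATENT_DIM))), real_labels)
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()

Repeating train_step over many batches of real photos is what gradually makes the generator’s fakes hard to distinguish from real images, which is the property the rest of the article is concerned about.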

Professor Clare McGlynn, from Durham Law School, Durham University, says that the ambiguity here in the UK in using terms like fakeporn, revenge porn, and harassment can leave victims without recourse.

‘There is no one law that covers the making of fakeporn without consent,’ Professor McGlynn says. ‘In some situations, it will be possible to bring a civil action for damages as it may be a breach of privacy or defamatory…

‘The problem is that these laws are not well-known, to either police or victims, and don’t cover all situations of fakeporn; and bringing a civil action can be expensive.

‘One of the many problems with the term “revenge porn” is that it only refers to the sharing/distribution of sexual images. Fakeporn is about images being created without consent.’

Google added ‘involuntary synthetic pornography’ to its ban list recently, meaning anyone can request that deepfake images of themselves are removed from the search engine. Whether the perpetrators can be found, given the secrecy of the dark web and the lengths people go to remain anonymous online, remains to be seen.

The legalities of it are just one end of the problem. The personal ramifications of deepfake porn can be catastrophic.

Indian journalist Rana Ayyub was targeted in a deepfake porn plot last year, which she said left her throwing up and crying and was a method to ‘silence’ her. Scarlett Johansson stated in the Washington Post that, while the videos of her were ‘demeaning’, ‘this doesn’t affect [her] as much because people assume it’s not actually [her] in a porno’.

Non-celebrities won’t have the same cachet and money to fight back. What would you do if a realistic-looking porn video of you was sent to your family or workplace? The extent to which revenge porn already ruins lives is no secret, so when any act can be simulated to look as if you’re involved (and the powers that be are unable to verify or disprove it), how do you protect your reputation?

There is no way to fully protect against deepfakes being made of you. ‘If one has shared [any] photos then the risk is there,’ Prof Citron says, but the technology at present means it’s difficult to make realistic videos of ‘normal’ people.

It’s not to say it won’t happen, but most accessible tech now needs a whole lot of footage to learn from, and a few Facebook videos and Boomerangs on Instagram just won’t cut it.

Professor McGlynn believes that we need to take action in the meantime to ensure this problem is minimised once the tech is more readily available: ‘We need to recognise that the harms of having fakeporn made and shared are just as great as “real” images.

‘Unless action is taken, perpetrators will be able to act with impunity – simply creating fakeporn, rather than “real” images.’

She recommends a ‘comprehensive law that covers all forms of image-based sexual abuse’ similar to that in Scotland (which already covers image altering), saying: ‘Such a law can help to future-proof against new ways of using technology to harm and harass.’

Reddit has already banned the deepfake subreddit from its site but still has a safe-for-work forum available for people to share work that stays within the law. On other messageboard sites, it is not hard to find high-quality fake videos of a number of people (including game characters).

At present, although web giants like Reddit and Google are trying to control deepfake proliferation, the technology is moving faster than they ever can.

While web users may enjoy swapping faces with their favourite singer or seeing Vladimir Putin do a dance in a deepfake video, the potential is much darker. When that becomes privacy breaches, blurred lines between reality and lies, and ruined reputations (which the likes of Rana Ayyub, or 24-year-old deepfake revenge porn victim Noelle Martin, may argue it already has), it will be up to governments and tech companies to catch up with the fast-moving technology as quickly as they can.


TOPICS: Arts/Photography; Business/Economy; Education; Music/Entertainment
KEYWORDS: 1984; computergraphics; computers; deepfake; fakenews; graphics; porn; preppers; software; tech
To: null and void
Who would watch that?

People into feats of strength and locomotives.

21 posted on 05/31/2019 7:08:23 AM PDT by BipolarBob (AOC is the Democrat prophecy come true : "A bartender will lead them".)
[ Post Reply | Private Reply | To 18 | View Replies]

To: cuban leaf

Go to your local WalMart and look up at the monitor near the door, showing the cameras scanning you thoroughly.

Why, I have no idea...but lately, I try to do a little tap dance just to watch it freak out, trying to keep up with me.


22 posted on 05/31/2019 7:12:02 AM PDT by Salamander (Death makes angels of us all, and give us wings where we once had shoulders, smooth as ravens' claws)
[ Post Reply | Private Reply | To 7 | View Replies]

To: null and void; Seruzawa

If she were pulling a train with her ginormous horse teeth chomped onto a rope, I’d watch that.


23 posted on 05/31/2019 7:14:09 AM PDT by Salamander (Death makes angels of us all, and give us wings where we once had shoulders, smooth as ravens' claws)
[ Post Reply | Private Reply | To 18 | View Replies]

To: cuban leaf
The TV show from a few years back, Person of Interest, had the general premise that a computer could gather all the data that crossed the internet and any wireless cameras and phones, then essentially filter it in such a way as to determine if people were in physical danger.

It seemed a little far-fetched, but now that we see data gathering and storage capacities becoming virtually limitless, it's really just a matter of time before the evaluation processes become more sophisticated.

24 posted on 05/31/2019 7:16:24 AM PDT by Repealthe17thAmendment
[ Post Reply | Private Reply | To 7 | View Replies]

To: ClearCase_guy

“So please ignore the upcoming film of Bill and Hillary Clinton with children.”

Nailed it.

So far, the Pizzagate group and other SRA groups infesting the halls of power have managed to quash most of the leaks. I suppose it might help that the evidence against them is contraband—illegal for anyone to possess. This means whistleblowers cannot easily disseminate it to expose the perps.


25 posted on 05/31/2019 7:24:00 AM PDT by unlearner (War is coming.)
[ Post Reply | Private Reply | To 3 | View Replies]

To: Mr. Jeeves
Eventually, only releases from an authorized channel will be reliable,

Eventually, even releases from an authorized channel will not be reliable.

'Eventually' being 'today'.

26 posted on 05/31/2019 7:28:53 AM PDT by null and void (The press is always lying. When they aren't actively lying, they are actively concealing the truth.)
[ Post Reply | Private Reply | To 19 | View Replies]

To: Salamander

On a rope would be a lot better than what I can imagine them chomped down on...


27 posted on 05/31/2019 7:31:29 AM PDT by null and void (The press is always lying. When they aren't actively lying, they are actively concealing the truth.)
[ Post Reply | Private Reply | To 23 | View Replies]

To: Red Badger

That lack of paragraphs makes me think the article was written by AI. What are they up to now?


28 posted on 05/31/2019 7:33:42 AM PDT by DungeonMaster (Prov 24: Do not fret because of evildoers. Do not associate with those given to change.)
[ Post Reply | Private Reply | To 1 | View Replies]

To: ShadowAce

You are right.

Sex involves two people :)


29 posted on 05/31/2019 7:45:01 AM PDT by dp0622 (The Left should know if Trump is kicked out of office, it is WAR!)
[ Post Reply | Private Reply | To 11 | View Replies]

To: DungeonMaster

It was my fault for not previewing and re-formatting......................


30 posted on 05/31/2019 7:51:57 AM PDT by Red Badger (We are headed for a Civil War. It won't be nice like the last one....................)
[ Post Reply | Private Reply | To 28 | View Replies]

To: Salamander

I think you don’t know what “pulling a train” means. Google it. Lol.


31 posted on 05/31/2019 8:00:06 AM PDT by Seruzawa (TANSTAAFL!)
[ Post Reply | Private Reply | To 23 | View Replies]

To: ClearCase_guy

Exactly. They know that the truth is coming out and they want to obscure the evidence. Not just the Clintons, either.


32 posted on 05/31/2019 8:02:49 AM PDT by Bigg Red (Beta Male O'Rourke is a fake Mexican.)
[ Post Reply | Private Reply | To 3 | View Replies]

To: cuban leaf
" that one thing completely unbelieveable in the book, 1984, was a TV that would watch you.”

That book is why, when I purchased a new computer, I specifically decided NOT to have a screen with a built-in camera.

Still, with my viewing habits, it won't be hard to track my activities....
33 posted on 05/31/2019 8:12:34 AM PDT by RedMonqey (Welcome to Thunderdome... America 2019)
[ Post Reply | Private Reply | To 7 | View Replies]

To: Seruzawa
" How about AOC pulling a train?”

Without the internet, I doubt I would know what "pulling a train" would mean.

Sadly, I do...
34 posted on 05/31/2019 8:14:58 AM PDT by RedMonqey (Welcome to Thunderdome... America 2019)
[ Post Reply | Private Reply | To 13 | View Replies]

To: apillar

They actually had a Star Trek TNG episode like that, where Reg from engineering was creating holodeck programs starring holographic versions of his crewmates, including a romantic relationship with Lt. Troi.


35 posted on 05/31/2019 8:54:59 AM PDT by Boogieman
[ Post Reply | Private Reply | To 10 | View Replies]

To: Red Badger
This is why the Democrats can't nominate Kamala Harris.

They don't understand "why" yet, but they will...

36 posted on 05/31/2019 9:11:34 AM PDT by kiryandil (Never pick a fight with an angry beehive)
[ Post Reply | Private Reply | To 1 | View Replies]

To: Red Badger

Then realize that those types of labs still exist and that those replacements that still take a whole floor can do some truly astonishing things.


37 posted on 05/31/2019 9:55:37 AM PDT by gnarledmaw (Hive minded liberals worship leaders, sovereign conservatives elect servants.)
[ Post Reply | Private Reply | To 15 | View Replies]

To: Salamander

Go ahead … make obscene gestures at the camera.


38 posted on 05/31/2019 10:01:23 AM PDT by NorthMountain (... the right of the peopIe to keep and bear arms shall not be infringed)
[ Post Reply | Private Reply | To 22 | View Replies]

To: Red Badger

I always wondered why the Star Trek hologram deck was always busy. When that technology becomes real, people will not want to work anymore.


39 posted on 05/31/2019 12:14:25 PM PDT by seawolf101 (Member LES DEPLORABLES)
[ Post Reply | Private Reply | To 1 | View Replies]

To: Red Badger
"With the advent of high definition computer graphics, you can create anything that your imagination can conjure up."

Somehow the movie Forbidden Planet (1956) comes to mind.

"But the Krell forgot one thing.
Monsters, John.
Monsters from the Id."

40 posted on 05/31/2019 7:10:13 PM PDT by clearcarbon
[ Post Reply | Private Reply | To 1 | View Replies]



