Posted on 05/31/2019 6:20:47 AM PDT by Red Badger
Pornography online attracts millions of erotica-hungry people ready to see sex on demand. You can simply ask your phone to show you anything you desire and there it is: any time, any place. With the advent of deepfake porn, the possibilities have expanded even further, with people who have never starred in adult films looking as though they're doing sexual acts on camera. Experts have warned that these videos enable all sorts of bad things to happen, from paedophilia to fabricated revenge porn.

What are deepfakes? Deepfakes are videos and images that use deep-learning AI to forge something not actually there. This can be done to make a fake speech that misrepresents a politician's views, or to create porn videos featuring people who did not star in them.

They're made in two ways. The first uses a generative adversarial network, or GAN: a type of AI with two parts, one that creates the fake images and one that works out how realistic they are, learning from its past mistakes (see the code sketch below). Autoencoders are another way to create deepfakes: neural networks that can learn all the features of a given image, then decode those features so they can change the image. These methods vary in efficacy and quality, with GANs giving less blurry results but being trickier to train.

Samsung recently created an AI that was able to make deepfake videos using single images, including the Mona Lisa and the Girl With A Pearl Earring. We saw these iconic paintings smiling, talking, and looking completely alive. In recent weeks there has been an explosion of face-swapping content, with Snapchat and FaceApp (among others) releasing realistic filters that let you see yourself as the opposite gender, while older ageing filters went viral once more.

For all the fun, however, there is a darker side to using AI to create deepfakes. A number of celebrities have had their faces superimposed onto pornographic videos, with the likes of Selena Gomez, Emma Watson, Scarlett Johansson, and Jennie from girl group Blackpink falling victim. Deepfakes of Donald Trump and Barack Obama have been made, and there are concerns that they could be used to undermine democracy as well as people's personal privacy.

DARPA in the US has spent millions on media forensics to thwart these videos, working with academics across the world to detect what's real and what isn't. But, according to Hany Farid, a Dartmouth College computer-science professor who advises a similar forensic fake-spotting service called Truepic, specialists working to build these systems are still "totally outgunned."

In the UK, there is no specific legislation against deepfakes (though those distributing videos can be charged with harassment), bringing calls for more stringent laws on altered images. In principle, someone could claim that their likeness was used with malicious intent, and in the US this could be tried as defamation or under a "false light" tort. Cases could also be brought under revenge porn laws, or as identity theft or cyber-stalking.

"In the US, the legal options are small but potent if (big if) one has the funds to hire an attorney and one can find the creator," Danielle Citron, professor at the University of Maryland, tells Metro.co.uk. "Defamation and intentional infliction of emotional distress are potential claims."
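To make the two-part GAN idea above concrete, here is a minimal sketch in PyTorch. The framework choice, layer sizes, and training loop are all illustrative assumptions, not the method any particular deepfake tool actually uses: a generator fabricates images from random noise, and a discriminator judges how realistic they look, each learning from the other's output.

```python
import torch
import torch.nn as nn

LATENT_DIM = 64    # size of the random noise vector the generator starts from
IMG_DIM = 28 * 28  # a flattened toy image

# Part 1: the generator creates fake images from random noise.
generator = nn.Sequential(
    nn.Linear(LATENT_DIM, 256),
    nn.ReLU(),
    nn.Linear(256, IMG_DIM),
    nn.Tanh(),  # pixel values in [-1, 1]
)

# Part 2: the discriminator works out how realistic an image is.
discriminator = nn.Sequential(
    nn.Linear(IMG_DIM, 256),
    nn.LeakyReLU(0.2),
    nn.Linear(256, 1),
    nn.Sigmoid(),  # probability the image is real
)

loss_fn = nn.BCELoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

def train_step(real_images: torch.Tensor) -> None:
    batch = real_images.size(0)
    noise = torch.randn(batch, LATENT_DIM)
    fakes = generator(noise)

    # The discriminator "learns from its past mistakes": it is rewarded
    # for calling real images real and generated images fake.
    d_opt.zero_grad()
    d_loss = loss_fn(discriminator(real_images), torch.ones(batch, 1)) + \
             loss_fn(discriminator(fakes.detach()), torch.zeros(batch, 1))
    d_loss.backward()
    d_opt.step()

    # The generator is rewarded for fooling the discriminator.
    g_opt.zero_grad()
    g_loss = loss_fn(discriminator(fakes), torch.ones(batch, 1))
    g_loss.backward()
    g_opt.step()

# Example: one step on a batch of 32 toy "real" images in [-1, 1].
train_step(torch.rand(32, IMG_DIM) * 2 - 1)
```

Real face-swap deepfakes train on thousands of frames of a target's face and use far larger networks; this toy version only shows the adversarial two-part structure the article describes.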
Professor Clare McGlynn, from Durham Law School, Durham University, says that the ambiguity here in the UK in using terms like fakeporn, revenge porn, and harassment can leave victims without recourse.

"There is no one law that covers the making of fakeporn without consent," Professor McGlynn says. "In some situations, it will be possible to bring a civil action for damages as it may be a breach of privacy or defamatory. The problem is that these laws are not well known, to either police or victims, and don't cover all situations of fakeporn; and bringing a civil action can be expensive. One of the many problems with the term revenge porn is that it only refers to the sharing/distribution of sexual images. Fakeporn is about images being created without consent."

Google recently added "involuntary synthetic pornography" to its ban list, meaning anyone can request that deepfake images of themselves be removed from the search engine. Whether the perpetrators can be found, given the secrecy of the dark web and the lengths people go to remain anonymous online, remains to be seen.

The legalities are just one end of the problem. The personal ramifications of deepfake porn can be catastrophic. Indian journalist Rana Ayyub was targeted in a deepfake porn plot last year, which she said left her "throwing up and crying" and was a method to silence her. Scarlett Johansson stated in the Washington Post that, while the videos of her were demeaning, this "doesn't affect [her] as much because people assume it's not actually [her]" in a porno.

Non-celebrities won't have the same cachet and money to fight back. What would you do if a realistic-looking porn video of you was sent to your family or workplace? The extent to which revenge porn already ruins lives is no secret, so when any act can be simulated to look as if you're involved (and the powers that be are unable to verify or disprove it), how do you protect your reputation?

There is no way to fully protect against deepfakes being made of you. "If one has shared [any] photos then the risk is there," Prof Citron says, but the technology at present means it's difficult to make realistic videos of ordinary people.
It's not to say it won't happen, but most accessible tech now needs a whole lot of footage to learn from, and a few Facebook videos and Boomerangs on Instagram just won't cut it.

Professor McGlynn believes we need to take action in the meantime to ensure this problem is minimised once the tech is more readily available: "We need to recognise that the harms of having fakeporn made and shared are just as great as real images. Unless action is taken, perpetrators will be able to act with impunity simply by creating fakeporn, rather than real images."

She recommends a comprehensive law covering all forms of image-based sexual abuse, similar to that in Scotland (which already covers image altering), saying: "Such a law can help to future-proof against new ways of using technology to harm and harass."

Reddit has already banned the deepfake subreddit from its site, but still hosts a safe-for-work forum for people to share work that stays within the law. On other messageboard sites, it is not hard to find high-quality fake videos of a number of people (including game characters). At present, although web giants like Reddit and Google are trying to control deepfake proliferation, the technology is moving faster than they can.

While web users may enjoy swapping faces with their favourite singer or seeing Vladimir Putin do a dance in a deepfake video, the potential is much darker. When that becomes privacy breaches, blurred lines between reality and lies, and ruined reputations (which the likes of Rana Ayyub, or 24-year-old deepfake revenge porn victim Noelle Martin, may argue it already has), it will be up to governments and tech companies to catch up with the fast-moving technology as quickly as they can.
You can take some innocent photographs and turn them into whatever you wish..............
Tech Ping!...................
So please ignore the upcoming film of Bill and Hillary Clinton with children.
Back in the sixties I told my father that in the future, with a computer, I'd be able to show his face in two places at the same time. He laughed.
The Mockingbird State Media has a new tool with which to create fake news attacks on its perceived political enemies. Pass around a video of President Trump making Hitler salutes? No problem. The media will run with it for weeks, and when it's finally proven to be fake, they won't even issue a retraction. They'll just dismiss it as old news or justify it because he can't prove he didn't do a Nazi salute somewhere at some time in his past. (Mueller's new legal standard)
I read that some (very sick) folks are putting 12-year-old girls’ heads on adult bodies, and also putting adult heads, perhaps a movie star’s, on 12-year-old bodies.
And yes, for nefarious pornographic purposes.
I’m no saint when it comes to porn.
But I have ACTIVELY decided not to look at it.
Everyone here is either as old as I am or older, pretty much, so there is no preaching to be done :)
But it ####s you up.
It distorts reality.
It doesn’t make you feel so great about yourself afterwards.
AND THERE’S A LIVE WOMAN IN THE HOUSE FOR GOD’S SAKE! :)
That I DO find attractive! There’s the rub. No pun intended.
It’s not heroin or crystal meth, in that it won’t kill you physically.
There are other ways of dying, though.
Hardcore porn is in general bad news.
I remember saying, back in the 80’s, that one thing completely unbelievable in the book 1984 was a TV that would watch you. The computer hardware simply could not manage it, as far as I was concerned.
Now it’s worse than even Orwell imagined. And they don’t have to “force” you to leave it on. Rather, you are incentivized to, and you forget that it is even there.
And then there is Alexa...
Yes, exactly. What we have is not really Orwell... it is much more Huxley's prophecies that turned out to be correct.
Horrible. As if porn wasn’t destructive enough to men and marriages to begin with.
Porn is not about sex. It's about fantasy, control, violence.
The general population needs to know this technology exists and will be used in the future to blackmail and extort people. They need to know that anything can be made to seem real.
Used to be ‘Show me video or it didn’t happen’, now even that standard is gone..................
Two can play that game. How about AOC pulling a train?
Now, it doesn't even have to be 'ON'. Technology and software can access the electronics to selectively turn on just the microphone or camera or whatever they wish................
Back in the sixties, a ‘computer’ was an entire floor in a huge building, with guys wearing white coats and air conditioners the size of boxcars.
Today’s smart watches are more powerful than those machines were..........
The heart of man is desperately wicked, who can know it? AI and CGI will no doubt aid the promotion of the big lie, after the great departure.
And in 10, 20 or 30 years?
The world, it is a-changing. And not in a good way for the individual human being. It’s one reason I don’t think the Lord is gonna wait much longer. It’s about run its course. And it is changing exponentially.
Well, it’s already true that the President’s words as reported by NBC News cannot be believed as authentic. The MSM has been doing DeepFake political reporting for a long time; it is just now moving into the video realm. Eventually, only releases from an authorized channel will be reliable.
Not that you’ve thought about it much.