Posted on 01/26/2024 6:12:45 PM PST by Gena Bukin
SAG-AFTRA deplored the AI-generated graphic images of Taylor Swift that went viral on X (formerly Twitter) this week, calling the content “upsetting, harmful, and deeply concerning” in a statement issued on Friday.
“The development and dissemination of fake images — especially those of a lewd nature — without someone’s consent must be made illegal,” the union said, while also voicing support for Congressman Joe Morelle’s Preventing Deepfakes of Intimate Images Act to combat the practice. “As a society, we have it in our power to control these technologies, but we must act now before it is too late. We support Taylor, and women everywhere who are the victims of this kind of theft of their privacy and right to autonomy.”
The images spurred significant conversation online and on Capitol Hill this week about the need for more protections against artificial intelligence and for stronger content moderation. One of the tweets with an AI-generated explicit image of the pop star remained on the platform for about 17 hours and garnered 45 million views before it was finally removed.
When asked about the incident during a press conference on Friday, White House press secretary Karine Jean-Pierre called the images “alarming.” Jean-Pierre called upon social media companies to more strongly enforce content moderation policies and further said Congress should take action to pass protective legislation.
“We are alarmed by the reports of the circulation of images that you laid out, false images to be more exact,” Jean-Pierre said. “While social media companies make their own independent decisions about content management, we believe they have an important role to play in enforcing their own rules to prevent the spreading of misinformation. Of course Congress should take legislative action. That’s how you deal with some of these issues.”
Swift is far from the only woman, famous or otherwise, to have been subjected to AI-generated explicit images of herself made without her permission. As Rolling Stone reported this week, while Swifties have been vocal about the need for more protections to prevent this from happening, such actions likely won’t come very easily.
AI deepfake porn is among the more disturbing uses of the technology to have hit prominent entertainment figures, though the tech has also been used to create new material without artists’ permission. George Carlin’s estate, for instance, filed a lawsuit this week over the unauthorized use of his works to make a new comedy special. The AI-generated Drake and The Weeknd song “Heart on My Sleeve” caused a significant stir last year as well, with UMG pressuring streaming platforms to take down infringing content that uses AI.
AI was one of the key negotiating points during SAG’s strike last year, and the union also gave its support to the Human Artistry Campaign, an initiative from some of the most powerful music and entertainment groups in the world set on prioritizing human creativity and ensuring AI doesn’t displace art.
Anyhow, links at the source and pics can be found on Twitter. I'm not going to post any here. You can use your imagination.
Imagine that these were images of Donald Trump… Think there would be this outrage?
“The development and dissemination of fake images — especially those of a lewd nature — without someone’s consent must be made illegal,”
My first reaction to that was laughter.
The AI genie is already out of the bottle, SAG-AFTRA.
Indeed, granted these new AI deepfakes are an entire order of magnitude better than the old photoshop hack jobs of 20+ years ago.
And I'm not really sure what legislation can address anything produced or distributed outside of the United States. Probably the best they can ask for is that major social media and other popular websites voluntarily remove them. Of course, they'll still be easily found elsewhere.
The article doesn’t even explain who SAG-AFTRA is.
Or Taylor Swift!
Or not. :-P
Remember how the porn world made a movie about Palin....not a peep from the Left.
As vile as those images might be, this is a First Amendment issue. The only possible exception might be if such images are clearly presented as genuine, when they are not.
You don’t want to be the target of such filth? Fine. Fair enough. Then don’t be a famous actor or singer. Go be a plumber or forest ranger.
There have been fake Taylor Swift nudes for her entire career. It's kinda weird how this just became a thing this past week.
I doubt it.
I believe Breitbart already posted one or two of them. I’m surprised they were able to at all.
Or a high school student.
Teen boys at New Jersey school accused of creating AI deepfake nudes of female classmates
I overlooked this part of your post.
Yes, this could be a "Flynt vs. Falwell" type of case. No matter how good the deepfakes are, no reasonable person believes that's really Taylor Swift getting gangbanged by five black dudes.
There's no First Amendment protection for libel, slander, or defamation.
Telling someone not to be a famous singer or actor is not the answer.
Does someone have a First Amendment right to create a fake picture of you burning a Koran?
No, and they don't have a right to create a fake picture of you to defame you either.
That high school incident raises an interesting legal point. The girls should sue. The 1A does not apply here. And that’s because no government would be going after those boys. The girls would be.
Contrast that with those folks who want the government to step in, and make a law prohibiting fakes. They are trying to criminalize what should be a civil matter.
The first ones were pretty tame until they started whining.
The next ones were pretty raunchy.
The ones showing AOC were funny, never knew she was such a SLUT.
> There’s no First Amendment protection for libel, slander, or defamation. <
Right. But please see my post #18.