DeepSummary
The episode discusses the recent issue of non-consensual AI-generated pornographic images of Taylor Swift going viral on the social media platform X (formerly Twitter). Emanuel Maiberg, a journalist covering deepfakes and artificial intelligence, explains how users found loopholes in Microsoft's AI image generator, Designer, to create the explicit images of Swift, bypassing its content filters.
Maiberg provides insight into the online communities dedicated to sharing and discussing non-consensual pornography, where the Swift images originated. He highlights the challenges regular people face in combating such abusive content and the lack of effective moderation on platforms like X. Maiberg also draws parallels to the evolution of content moderation on Pornhub, suggesting a similar trajectory may occur with AI-generated explicit content.
The episode delves into the potential for AI technology to be misused for harmful purposes, particularly targeting women. Maiberg emphasizes the need for stricter regulation and oversight, predicting that a significant incident may be necessary to spur meaningful changes from tech companies and policymakers.
Key Episode Takeaways
- Non-consensual AI-generated explicit content, particularly targeting women, is an emerging issue enabled by advancements in AI technology and lax content moderation on platforms.
- Online communities dedicated to sharing non-consensual pornography have found ways to bypass filters and create explicit AI-generated images of celebrities like Taylor Swift.
- Regular individuals face significant challenges in combating the spread of abusive AI-generated content targeting them, lacking the resources and attention garnered by high-profile cases.
- Meaningful regulation and changes from tech companies may require a particularly egregious incident involving AI-generated explicit content to spur action.
- While AI has many practical applications, its development and progress are significantly driven by the demand for generating pornographic content, according to the journalist's perspective.
- The issue of AI-generated explicit content parallels the evolution of content moderation on platforms like Pornhub, where major changes occurred after significant public pressure and lawsuits.
- Stricter oversight and guardrails are needed to prevent the misuse of AI technology for harmful purposes, particularly targeting women.
- Celebrity cases like Taylor Swift's can bring widespread attention to the issue, but addressing the broader problem requires systemic changes in content moderation and AI governance.
Top Episode Quotes
- "We've been reporting on this since 2017, the entire time. Taylor Swift is one of the most deepfaked people, and you've never seen a story blow up like this. And it's not as if people weren't trying to post it to Twitter back in the day, X today. They were. It's just that the generative AI tools are there and the moderation is not. So you have this perfect storm of this awful thing going viral." — Emanuel Maiberg
- "Part of the amazing thing about the Taylor Swift story is that you do see action. You see action from Microsoft, you see action from X, you see policy efforts, and you're not going to get this as a normal person or a minor." — Emanuel Maiberg
- "I truly believe, and I've written this before and I think the numbers show this, that people are saying that AI is doing all these things and it has many legitimate uses. And I don't think it's a flash in the pan. It's a technology that's here to stay. But the driving force of it, I truly believe, is the pornography. And that's what's driving progress." — Emanuel Maiberg
Episode Information
What Next: TBD | Tech, power, and the future
Slate Podcasts
2/2/24
For all the promise of the technology, one use case for artificial intelligence reared its ugly head last week: non-consensual pornographic images. As millions of users saw abusive A.I.-generated images of Taylor Swift proliferate across X, the pitfalls of this technology became clear.
Guest: Emanuel Maiberg, journalist and co-founder of 404 Media
If you enjoy this show, please consider signing up for Slate Plus. Slate Plus members get benefits like zero ads on any Slate podcast, bonus episodes of shows like Slow Burn and Dear Prudence—and you’ll be supporting the work we do here on What Next TBD. Sign up now at slate.com/whatnextplus to help support our work.
Check out Compiler here.
Learn more about your ad choices. Visit megaphone.fm/adchoices