DeepSummary
The podcast episode discusses the dangers of online misinformation and disinformation, particularly in the context of the 2022 midterm elections in the United States. The guests, Joe Miller and David Toomey, explain the difference between misinformation (unintentional false information) and disinformation (intentional spread of false information to mislead). They highlight how social media platforms have amplified the spread of false information about voting procedures, election fraud claims, and harassment of election workers.
The conversation explores the legal responsibilities of social media companies in moderating and removing harmful content, as well as the challenges they face in enforcing their own policies consistently. The guests argue that these platforms have a duty of care to protect users from false and harmful information, as the First Amendment was never intended to protect such speech.
The episode also delves into the need for diverse media ownership, local news sources, and efforts by civil rights organizations to combat misinformation and disinformation. The guests encourage users to report false information and advocate for more socially responsible practices by these companies, emphasizing the importance of collective action and public pressure.
Key Episode Takeaways
- Misinformation and disinformation are distinct problems, with misinformation referring to unintentional false information and disinformation being intentional spread of false information to mislead.
- Social media platforms have amplified the spread of misinformation and disinformation, particularly around voting procedures, election fraud claims, and harassment of election workers during the 2022 midterm elections.
- The guests argue that social media companies have a legal and ethical responsibility to moderate harmful content and to enforce their own policies consistently, contending that the First Amendment does not protect false and harmful speech.
- There is a need for diverse media ownership and local news sources to combat misinformation and disinformation, as well as efforts by civil rights organizations to advocate for responsible content moderation practices.
- Users can play a role in reporting false information and advocating for more socially responsible practices by social media companies, through collective action and public pressure.
- The business models of social media companies, which prioritize engagement and advertising revenue, can contribute to the spread of misinformation and disinformation, presenting a challenge for content moderation efforts.
- Recent changes at Twitter, including layoffs and the potential relaxation of content moderation policies, could exacerbate the problem of misinformation and disinformation on the platform.
- Collective action and advocacy efforts by civil rights organizations and users are crucial in holding social media companies accountable and promoting responsible content moderation practices.
Top Episode Quotes
- “Fortunately, we didn't see outcomes anywhere near what we feared in November. But this continues to be a problem, whether we're talking about here in the US or we're talking abroad and Russian propaganda happening in and around places like Ukraine.” by Joe Miller
- “We really do need more of our own platforms. That's always been the issue.” by Joe Miller
- “To the extent they have put some policies in place to deal with hate speech, voting, other issues, there are folks there who want to do content moderation. They know that if they don't have some parameters on it, it could just be a total free-for-all that we could see happening with Twitter eventually.” by David Toomey
- “One thing we've done as a civil rights organization is we've tried to stay engaged with the platforms, and we've been very public with them about things we'd like them to change.” by David Toomey
Episode Information
Podcast: Pod for the Cause
Organization: The Leadership Conference on Civil and Human Rights
Date: 11/23/22