Substack’s Moderation Battle: All the Latest News
Substack’s moderation battle has become a contentious issue in the publishing world. Since its launch in 2017, Substack has provided a platform for writers to monetize their work through paid newsletters. However, concerns have arisen over the platform’s approach to content moderation. While Substack bans pornography and spam, it has allowed Nazi-supporting and white supremacist newsletters to operate on its platform, some of which generate commissions for Substack. This prompted more than 200 Substack authors to sign an open letter demanding answers as to why such content is allowed. Substack initially defended its policies, though it later removed some publications and added a report button. Stay tuned for the latest updates on this ongoing moderation battle.
Since its launch in 2017, Substack has become a popular platform for writers to monetize their work through paid newsletters. In recent months, however, the company has been embroiled in a content moderation battle over its handling of controversial content. This article covers the latest news on Substack’s moderation policies: the open letter from Substack authors, the response from co-founder Hamish McKenzie, the departure of the publication Platformer, the removal of some Nazi-supporting newsletters, the introduction of a report button, and the ongoing criticism of Substack’s stance.
Substack’s Approach to Content Moderation
An investigation by The Atlantic in November 2023 revealed that while Substack banned pornography and spam, it allowed Nazi-supporting and white supremacist newsletters on its platform. Some of these newsletters even had paying subscribers that generated commissions for Substack. This discovery raised concerns among Substack authors, leading to an open letter questioning why the platform allows these types of content to be published and monetized.
Open Letter from Substack Authors
Over 200 Substack authors signed an open letter expressing their concerns about the platform’s content moderation policies. They specifically questioned why Substack allowed newsletters that support Nazis to be published and monetized. The letter sought clarification on Substack’s stance and urged the platform to reconsider its policies.
Response from Substack Co-founder Hamish McKenzie
In response to the open letter, Substack co-founder Hamish McKenzie wrote a public response defending the platform’s decision not to remove or demonetize Nazi content. He argued that Substack upholds the principles of free expression and believes in allowing a wide range of perspectives on its platform.
Recent Update on Substack’s Moderation Policies
Following the controversy surrounding its moderation policies, Substack introduced a report button in its app. The update lets users report posts and publications that violate Substack’s guidelines. While the addition may address some concerns, many users remain dissatisfied with Substack’s broader approach to content moderation.
Platformer Leaves Substack
Platformer, a popular tech newsletter started by Casey Newton, announced its decision to leave Substack over the platform’s policies and its response to Nazi content. Newton had identified several Substack publications that expressed explicit support for the Nazis and called for violence against Jews and other groups. Although Substack removed some of the publications on the list, Platformer also cited the leaking of off-the-record conversations as a reason for its departure.
Substack Removes Nazi Newsletters
In response to mounting pressure, Substack removed some publications that supported Nazis. The move drew reactions from rival platforms and from paying customers who had already left or were considering leaving Substack. The removals also raised questions about the consistency of Substack’s commitment to content moderation and its handling of extremist views.
Public Response to Substack’s Moderation Policies
Legal blogger Ken White, also known as Popehat, criticized Substack’s claims about being a principled upholder of free expression. He questioned the platform’s tolerance of extremist content and argued that there is a difference between tolerance and platforming. Additionally, another open letter signed by Substack authors further questioned the platform’s decision to support and monetize Nazi content.
Substack’s Report Button
With the report button now in Substack’s app, users can flag posts and publications that violate the platform’s guidelines. This gives dissatisfied users a direct way to surface content they consider problematic, though flagged content is only removed if Substack judges it to violate those guidelines.
Criticism of Substack’s Moderation Stance
Substack has faced significant criticism for its moderation stance, particularly regarding its decision to allow Nazi-supporting and white supremacist content on its platform. Critics argue that by allowing such content to be published and monetized, Substack is providing a platform for hate speech and bigotry.
Substack’s Statement on Bigotry
In response to questions about its platforming and monetizing of Nazi content, Substack co-founder Hamish McKenzie stated that the company does not condone bigotry in any form. The statement offered no specifics on enforcement, however, leaving some skeptical of Substack’s commitment to tackling hate speech and extremism.
In conclusion, Substack’s content moderation battle has raised significant concerns among its writers and the broader public. The introduction of a report button and the removal of some Nazi-supporting newsletters are steps in the right direction, but Substack still faces criticism for its overall approach to moderation and its response to extremist views. To maintain the trust and support of its users, the platform will need to keep addressing these concerns and set out clear guidelines for acceptable content.