Under pressure from critics who say Substack is profiting from newsletters that promote hate speech and racism, the company’s founders said Thursday that they would not ban Nazi symbols and extremist rhetoric from the platform.
“I just want to make it clear that we don’t like Nazis either — we wish no one held those views,” Hamish McKenzie, a co-founder of Substack, said in a statement. “But some people do hold those and other extreme views. Given that, we don’t think that censorship (including through demonetizing publications) makes the problem go away — in fact, it makes it worse.”
The response came weeks after The Atlantic found that at least 16 Substack newsletters had “overt Nazi symbols” in their logos or graphics, and that white supremacists had been allowed to publish on, and profit from, the platform. Hundreds of newsletter writers signed a letter opposing Substack’s position and threatening to leave. About 100 others signed a letter supporting the company’s stance.
In the statement, Mr. McKenzie said that he and the company’s other founders, Chris Best and Jairaj Sethi, had arrived at the conclusion that censoring or demonetizing the publications would not make the problem of hateful rhetoric go away.
“We believe that supporting individual rights and civil liberties while subjecting ideas to open discourse is the best way to strip bad ideas of their power,” he said.
That stance elicited waves of outrage and criticism, including from popular Substack writers who said they did not feel comfortable working with a platform that allows hateful rhetoric to fester or flourish.
The debate has renewed questions that have long plagued technology companies and social media platforms about how content should be moderated, if at all.
Substack, which takes a 10 percent cut of revenue from writers who charge for newsletter subscriptions, has faced similar criticism in the past, particularly after it allowed transphobic and anti-vaccine language from some writers.
Nikki Usher, a professor of communication at the University of San Diego, said that many platforms are confronting what is known as “the Nazi problem,” the idea that if an online forum is available for long enough, extremists will eventually show up there.
Substack is establishing itself as a neutral provider of content, Professor Usher said, but that also sends a message: “We’re not going to try to police this problem because it’s complicated, so it’s easier to not take a position.”
More than 200 writers who publish newsletters on Substack have signed a letter opposing the company’s passive approach.
“Why do you choose to promote and allow the monetization of sites that traffic in white nationalism?” the letter said.
The writers also asked if part of the company’s vision for success included giving hateful people, such as Richard Spencer, a prominent white nationalist, a platform.
“Let us know,” the letter said. “From there we can each decide if this is still where we want to be.”
Some popular writers on the platform have already promised to leave. Rudy Foster, who has more than 40,000 subscribers, wrote on Dec. 14 that readers often tell her they “can’t stand to pay Substack anymore,” and that she feels the same.
“So here’s to a 2024 where none of us do that!” she wrote.
Other writers have defended the company. A letter signed by roughly 100 Substack writers says that it is better to let the writers and readers moderate content, not social media companies.
Elle Griffin, who has more than 13,000 subscribers on Substack, wrote in the letter that while “there is a lot of hateful content on the internet,” Substack has “come up with the best solution yet: Giving writers and readers the freedom of speech without surfacing that speech to the masses.”
She argued that subscribers receive only the newsletters they sign up for, so it is unlikely that they will receive hateful content unless they follow it. That is not the case on X and Facebook, Ms. Griffin said.
She and the others who signed the letter supporting the company emphasized that Substack is not really one platform, but thousands of individualized platforms with unique and curated cultures.
Alexander Hellene, who writes sci-fi and fantasy stories, signed Ms. Griffin’s letter. In a post on Substack, he said that a better approach to content moderation was “to take things into your own hands.”
“Be an adult,” he wrote. “Block people.”
In his statement, Mr. McKenzie, the Substack co-founder, also defended his decision to host Richard Hanania, the president of the Center for the Study of Partisanship and Ideology, on the Substack podcast “The Active Voice.” The Atlantic reported that Mr. Hanania had previously described Black people on social media as “animals” who should be subject to “more policing, incarceration, and surveillance.”
“Hanania is an influential voice for some in U.S. politics,” Mr. McKenzie wrote, adding that “there is value in knowing his arguments.” He said he was not aware of Mr. Hanania’s writings at the time.
Mr. McKenzie also argued in his statement that censorship of ideas that are considered to be hateful only makes them spread.
But research in recent years suggests the opposite is true.
“Deplatforming does seem to have a positive effect on diminishing the spread of far-right propaganda and Nazi content,” said Kurt Braddock, a professor of communication at American University who has researched violent extremist groups.
When extremists are removed from a platform, they often go to another platform, but much of their audience does not follow them and their incomes are eventually diminished, Professor Braddock said.
“I can appreciate somebody’s dedication to freedom of speech rights, but freedom of speech rights are dictated by the government,” he said, noting that businesses can choose the types of content they host or prohibit.
While Substack says it does not allow users to call for violence, even that distinction can be murky, Professor Braddock said, because racists and extremists can walk up to the line without overtly doing that. But their rhetoric can still inspire others to violence, he said.
Allowing Nazi rhetoric on a platform also normalizes it, he said.
“The more they use the kind of rhetoric that dehumanizes or demonizes a certain population,” Professor Braddock said, “the more it becomes OK for the general population to follow.”