Why I will not publish on Substack

One phenomenon of social information tech startups like Substack is that their explosive growth is directly proportional to the amount of venture capital injected into them. That's why you'll find everyone from journalists and activists to propagandists and politicians publishing their work on the paid newsletter platform: it's widely used which makes it widely available, and it's widely available which makes it widely used.

After announcing on TikTok that I was going to launch this blog, by far the most popular response was whether it was going to be on Substack. The answer (if you haven't figured that out yet) is a resounding no—which boils down to two reasons:

  • When Substack's CEO, Chris Best, was interviewed by Nilay Patel for the Decoder podcast, he went out of his way to not answer the host's question about whether a specific, racist statement would be allowed on the platform.
  • A week later, a second co-founder of Substack, Hamish McKenzie, tried to clear the air by saying they don't condone racism. Neither of these two men, however, answered the question about whether Substack would allow racist bigotry to be published. (The question was never about whether Substack's founders condone racism, it was about whether Substack would allow racism to be published on their platform.)

Substack's founders feel that they can combat the implicit problems with algorithmically curated content by employing paid newsletters that put agency in the hands of the authors who charge for their content. That's their prerogative. To me, though, they are doing what affluent tech startup people do when they look at problems through the privileged double-lens of technology and venture capital: confusing money with free speech.

How did we get here?

Substack appears and operates in the ways that many other tech startups in the social information space appear and operate—with an opaque handful of private investors like Andreessen Horowitz, whose manifesto about technology and society is exactly the kind of perspective you would expect from white billionaires supporting Donald Trump. Substack has an added bonus, though: its core revenue model is based around paid newsletter subscriptions.

But who is subscribing to these newsletters? An interesting question, when you think about it, since there's really no way to tell whether paid subscribers are actual humans, elements of anti-American disinformation campaigns, and/or out-in-the-open white supremacists.

(You would think that the CEO of such an enterprise would use publicly recorded interviews as an opportunity to clarify that their platform is in fact not a global distribution center for alt-right propaganda...)

👉 In 2019, Substack raised about $20M in funding to build a platform that helped people subscribe to paid newsletters.

👇 Four years later, it raised an additional $65M and announced the Substack Notes feature—a short-form content addition to the platform's ecosystem.

👈 That same year, Nilay Patel interviewed Substack's CEO Chris Best, and asked whether the statement "We should not allow brown people in the country" (which Patel pointed out appeared to be in violation of their Terms of Service) would be moderated.

Snippet from the interview (transcript)

(Patel): Wait. Hold on. In America in 2023, that is not so extreme, right? “We should not allow as many brown people in the country.” Not so extreme. Do you allow that on Substack? Would you allow that on Substack Notes?

(Best): I think the way that we think about this is we want to put the writers and the readers in charge–

No, I really want you to answer that question. Is that allowed on Substack Notes? “We should not allow brown people in the country.”

I’m not going to get into gotcha content moderation.

This is not a gotcha… I’m a brown person. Do you think people on Substack should say I should get kicked out of the country?

I’m not going to engage in content moderation, “Would you or won’t you this or that?”

That one is black and white, and I just want to be clear: I’ve talked to a lot of social network CEOs, and they would have no hesitation telling me that that was against their moderation rules.

Yeah. We’re not going to get into specific “would you or won’t you” content moderation questions.

Why?

I don’t think it’s a useful way to talk about this stuff.

But it’s the thing that you have to do. I mean, you have to make these decisions, don’t you?

The way that we think about this is, yes, there is going to be a terms of service. We have content policies that are deliberately tuned to allow lots of things that we disagree with, that we strongly disagree with. We think we have a strong commitment to freedom of speech, freedom of the press. We think these are essential ingredients in a free society. We think that it would be a failure for us to build a new kind of network that can’t support those ideals. And we want to design the network in a way where people are in control of their experience, where they’re able to do that stuff. We’re at the very early innings of that. We don’t have all the answers for how those things will work. We are making a new thing. And literally, we launched this thing one day ago. We’re going to have to figure a lot of this stuff out. I don’t think…

You have to figure out, “Should we allow overt racism on Substack Notes?” You have to figure that out.

No, I’m not going to engage in speculation or specific “would you allow this or that” content.

You know this is a very bad response to this question, right? You’re aware that you’ve blundered into this. You should just say no. And I’m wondering what’s keeping you from just saying no.

I have a blanket [policy that] I don’t think it’s useful to get into “would you allow this or that thing on Substack.”

If I read you your own terms of service, will you agree that this prohibition is in that terms of service?

I don’t think that’s a useful exercise.

☝️ At the end of 2023, Substack co-founder Hamish McKenzie told the world that moderating Nazi rhetoric is not fair to Nazis:

I just want to make it clear that we don’t like Nazis either—we wish no-one held those views. But some people do hold those and other extreme views. Given that, we don't think that censorship (including through demonetizing publications) makes the problem go away—in fact, it makes it worse.
We believe that supporting individual rights and civil liberties while subjecting ideas to open discourse is the best way to strip bad ideas of their power. We are committed to upholding and protecting freedom of expression, even when it hurts.

The comments on that post are worth reading, if for no other reason than to assure you that the irony in all this is alive and well: cries of censorship tend to emanate from those who seek the power to censor others.

(Personally, I'm disappointed in McKenzie as a fellow English major. Tolerating the intolerant has never been a takeaway from the last two decades of conversations among moderation communities of practice; I assign him a two-page reflection paper on The Parable of the Nazi Bar.)

I do not agree with Chris and Hamish's thesis that we can combat misinformation and disinformation by letting people decide who to spend money on. We don't have to look far to see how equating speech with money exacerbates existing information-related problems (see Citizens United v. FEC). They can wax poetic all they want about society having a trust problem that censorship cannot fix, but the reality is this: while the founders look upon us plebs as simpletons who fail to grasp how essential it is to the free exchange of ideas that these millionaires provide Nazis a revenue-generating platform for Nazi propaganda, everyone else knows exactly what they're doing.

I could talk about audience growth, about marketing and publication strategy, and about the many potential readers who might have discovered my work had I just done what most other people do and followed the path of least resistance by publishing on Substack. But just as the Trolley Problem implores us to ask whether our decisions are made in spite of or because of our moral and ethical obligations, so too must I ask whether my knowledge of this space (content moderation on social media; venture capital; technology startups; the limited worldviews of rich men in tech) compels me to resist the very temptation that seems to motivate the positions I disagree with: the route of greatest potential revenue.
