Substack says it will not remove or demonetize Nazi content

94 comments
  • There are a lot of empirical claims surrounding this topic, and I'm unaware who really has good evidence for them. The Substack guy e.g. is claiming that banning or demonetising would not "solve the problem" – how do we really know? At the very least, you'd think that demonetising helps to some extent, because if it's not profitable to spread certain racist ideas, that's simply less of an incentive. On the other hand, plenty of people on this thread are suggesting it does help address the problem, pointing to Reddit and other cases – but I don't think anyone really has a grip on the empirical relationship between banning/demonetising, shifting ideologues to darker corners of the internet, and the impact their ideas ultimately have. And you'd think the relationship wouldn't be straightforward either – there might be some general patterns, but it could vary according to so many contingent and contextual factors.

    • I agree it's murky. Though I'd like to note that when you shift hateful ideologues to dark corners of the internet, that also means making space in the main forums for people who would otherwise be forced out by the aforementioned ideologues - women, trans folks, BIPOC folks, anyone who would like to discuss xyz topic but not at the cost of the distress that results from sharing a space with hateful actors.

      When the worst of the internet is given free rein to run rampant, it has a tendency to take over the space entirely with hate speech, because everything and everyone else leaves instead of putting up with abuse. Those who do stay get stuck having the same rock-bottom conversations (e.g. those in which the targets of the hate are repeatedly asked to justify their existence, their presence, or their right to have opinions) over and over with people who aren't really interested in intellectual discussion, solving actual problems, or making art that isn't about hatred.

      But yeah, as with anything involving large groups of people, these things get complicated and can be unpredictable.

      • Thank you! Even on Lemmy I find the atmosphere often oblivious to or ignorant of marginalized views. The majority here are cis men (according to the poll earlier this year) and it certainly shows. And the people here are probably mostly left-leaning? So I definitely couldn't imagine sharing a space with anyone more right-leaning than that.

    • The problem when you own a space is that if you let certain groups of people in, such as, in this example, Nazis, you'll literally drive everyone else away from your space, so that what started off as a normal, ordinary space will become, essentially, a Nazi bar.

      It's not only Nazis — it can be fascists, white supremacists, meth-heads, PUAs, cryptocurrency fanboys — some groups will be so odious to others that they will drive everyone else from your space. The only solution you can enact is to ensure that they don't come to your place at all, even if they're nice and polite and "follow your rules" — because while they might, their friends won't, and those friends have a history of driving away other people from other spaces.

      "you have to nip it in the bud immediately. These guys come in and it's always a nice, polite one. And you serve them because you don't want to cause a scene. And then they become a regular and after a while they bring a friend. And that dude is cool too.

      And then THEY bring friends and the friends bring friends and they stop being cool and then you realize, oh shit, this is a Nazi bar now. And it's too late because they're entrenched and if you try to kick them out, they cause a PROBLEM. So you have to shut them down."

    • What evidence did you find to support Substack’s claims? They didn’t share any.

      You can quickly and easily find good evidence for things like Reddit quarantining and the banning of folks like Alex Jones and Milo Yiannopoulos.

      Which claims are empirical again?

    • There’s a lot of empirical claims surrounding this topic, and I’m unaware who really has good evidence for them. The Substack guy e.g. is claiming that banning or demonetising would not “solve the problem” – how do we really know?

      Well it depends what you define as "the problem".

      If you define it as Nazis existing per se, banning them does not "solve the problem" of Nazis existing. They will just go elsewhere. A whole world war was not enough to get rid of them.

      However, allowing them on mainstream platforms does make their views more visible to mainstream users, and some might fall for their propaganda, similar to the way people get attracted to the QAnon nonsense. So if you define the problem as "Nazis gaining attention", then yeah, sure, it certainly does "solve the problem" to some degree. And I think this is the main problem these days (even in the Netherlands, which is a fairly down-to-earth country, the fascists gained 24% of the votes in the last election!).

      However you define "the problem", though, making money off Nazi propaganda is just simply very, very bad form. And it will lead to many mainstream users bugging out, and rightly so.

    • We also do know that going after Nazis and white supremacists works, since all through the '90s they were relegated to the fringe of the fringe corners of the internet.

  • There are too many of these goddamned social networks anyway. After Twitter/X exploded, everyone else wanted to grab a piece of that pie, and now we've got a dozen social networks nobody uses.

    If you want a progressive social network that doesn't take shit from goosesteppers, Cohost is probably the place to go. It's so neurodivergent and trans-friendly that I can't imagine them blithely accepting Nazi content. It's just not how Cohost works. "Blah blah blah, free speech!" Not here, chumps. We've got standards. Go somewhere else to push that poison.

  • So they're perfectly complacent about it, then. If you are complacent about Nazis, to me, you're a Nazi. I don't give a shit. What's that saying the Germans have? Like, there are six guys at a table in a bar and one of them is a Nazi, therefore there are six Nazis at the table? Yeah, that.

  • As always, there are several different aspects:

    • Promoting [Nazi propaganda and misinformation]
    • Laughing at [...]
    • Analyzing [...]
    • Profiting from [...]

    Sometimes the difference between "promotion", "laughing at", and "analysis" depends more on the reader's approach than on the writer's intent.

    Then again, sometimes a reader decides they don't want to deal with any of it, which is also respectable.

    • Look. If there are 9 people at a table sitting with 1 Nazi and hanging out, there are 10 Nazis.

      • That's the quick, easy, and wrong approach.

        What are 9 non-Nazi people supposed to do: kick the 1 Nazi to a Nazi-only table? Leave the table and now have 2 Nazi-only tables? Get everyone thrown out?

        Nazism works like any other cult; what converted people need is exposure to other ways of thinking, ideally some human connection with the people whom the cult demonizes and tries to keep members away from. Pushing cult members away from society is precisely what the cult wants!

        I'm not saying that you personally, or even me, should be the ones to do that, or that we should idly watch, or not have a Nazi-free table.

        What I'm saying is that non-Nazis who put up with a Nazi in order to de-program them should be praised, and that you can't tell what's really going on just by watching who sits where.

  • 🤖 I'm a bot that provides automatic summaries for articles:

    ::: spoiler Click here to see the summary
    While McKenzie offers no evidence to back these ideas, this tracks with the company’s previous stance on taking a hands-off approach to moderation.

    In April, Substack CEO Chris Best appeared on the Decoder podcast and refused to answer moderation questions.

    “We’re not going to get into specific ‘would you or won’t you’ content moderation questions” over the issue of overt racism being published on the platform, Best said.

    In a 2020 letter from Substack leaders, including Best and McKenzie, the company wrote, “We just disagree with those who would seek to tightly constrain the bounds of acceptable discourse.”

    The Atlantic also pointed out an episode of McKenzie’s podcast with a guest, Richard Hanania, who has published racist views under a pseudonym.

    McKenzie does, however, cite another Substack author who describes its approach to extremism as one that is “working the best.” What it’s being compared to, or by what measure, is left up to the reader’s interpretation.


    Saved 57% of original text. :::

  • I never used Substack before, and they are doing a good job making sure I never do. I hope they like being the Nazi bar.