
Lemmyshitpost community closed until further notice

Hello everyone,

We unfortunately have to close the !lemmyshitpost community for the time being. We have been fighting the CSAM (Child Sexual Abuse Material) posts all day, but there is nothing we can do because they will just post from another instance since we changed our registration policy.

We keep working on a solution; we have a few things in the works, but they won't help us right now.

Thank you for your understanding and apologies to our users, moderators and admins of other instances who had to deal with this.

Edit: @Striker@lemmy.world, the moderator of the affected community, made a post apologizing for what happened. But this could not be stopped even with 10 moderators. And if it wasn't his community, it would have been another one. And it is clear this could happen on any instance.

But we will not give up. We are lucky to have a very dedicated team and we can hopefully make an announcement about what's next very soon.

Edit 2: removed that bit about the moderator tools. That came out a bit harsher than how we meant it. It's been a long day and having to deal with this kind of stuff got some of us a bit salty to say the least. Remember we also had to deal with people posting scat not too long ago so this isn't the first time we felt helpless. Anyway, I hope we can announce something more positive soon.

701 comments
  • This is seriously sad and awful that people would go this far to derail a community. It makes me concerned for other communities as well. Since they have succeeded in having shitpost closed, does this mean they will just move on to the next community? That being said, here is some very useful information on the subject and what can be done to help curb CSAM.

    The National Center for Missing & Exploited Children (NCMEC) CyberTipline: You can report CSAM to the CyberTipline online or by calling 1-800-843-5678. Your report will be forwarded to a law enforcement agency for investigation.

    The National Sexual Assault Hotline: If you or someone you know has been sexually assaulted, you can call the National Sexual Assault Hotline at 800-656-HOPE (4673) or chat online. The hotline is available 24/7 and provides free, confidential support.

    The National Child Abuse Hotline: If you suspect child abuse, you can call the National Child Abuse Hotline at 800-4-A-CHILD (422-4453). The hotline is available 24/7 and provides free, confidential support.

    Thorn: Thorn is a non-profit organization that works to fight child sexual abuse. They provide resources on how to prevent CSAM and how to report it.

    Stop It Now!: Stop It Now! is an organization that works to prevent child sexual abuse. They provide resources on how to talk to children about sexual abuse and how to report it.

    Childhelp USA: Childhelp USA is a non-profit organization that provides crisis intervention and prevention services to children and families. They have a 24/7 hotline at 1-800-422-4453.

    Here are some tips to prevent CSAM:

    Talk to your children about online safety and the dangers of CSAM.

    Teach your children about the importance of keeping their personal information private.

    Monitor your children's online activity.

    Be aware of the signs of CSAM, such as children being secretive or withdrawn, or having changes in their behavior.

    Report any suspected CSAM to the authorities immediately.

  • The number of people in these comments asking the mods not to cave is bonkers.

    This isn’t Reddit. These are hobbyists without legal teams to a) fend off false allegations or b) comply with laws that they don’t have any deep understanding of.

    • Yeah, you've got to think of this place like the big forums of 20 years ago: they're just run by a tiny handful of regular people, and having to deal with solicitors and other such stuff is entirely out of the question.

      If something's bad, you lock it down and purge it until it's not bad any more. Unfortunately that's the best you can do with such minimal resources as a regular member of the public, and for those that don't like it, there's other forums out there.

      This isn't one single huge monopoly like Reddit, where you either stay or leave forever; if you don't like how one instance is being run, just sign up on a different one. Takes the stress out of it :-)

  • Fucking bastards. I don't even know what beef they have with the community and why, but using THAT method to get them to shut down is nothing short of despicable. What absolute scum.

  • We have been fighting the CSAM (Child Sexual Abuse Material) posts all day, but there is nothing we can do because they will just post from another instance since we changed our registration policy.

    It's likely that we'll be seeing a large number of instances switch to whitelist-based federation instead of the current blacklist-based one, especially niche instances that do not want to deal with this at all (and I don't blame them).
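
    For context, older Lemmy deployments expressed this choice in the config.hjson federation section. This is a hypothetical sketch (on recent Lemmy versions the allow/block lists are managed through the admin UI instead, and key names may differ by version; the instance names are placeholders):

```hjson
{
  federation: {
    enabled: true
    # Whitelist mode: federate ONLY with the listed instances.
    # When this list is non-empty, everything else is refused.
    allowed_instances: ["example-instance-a.tld", "example-instance-b.tld"]
    # Blacklist mode (the default posture): federate with everyone
    # EXCEPT the listed instances.
    blocked_instances: []
  }
}
```

    The trade-off is the usual one: a whitelist fails closed (new instances must ask to be admitted), while a blacklist fails open (abusive instances federate until someone notices and blocks them).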

  • How does closing lemmyshitpost do anything to solve the issue? Isn't it a foregone conclusion that the offenders would just start targeting other communities or was there something unique about lemmyshitpost that made it more susceptible?

    • It stops their instance from hosting CSAM and removes the legal liability of dealing with something they don't have the capacity to handle at this point in time.

      How would you respond to having someone else forcibly load up your pc with child porn over the Internet? Would you take it offline?

      • How would you respond to having someone else forcibly load up your pc with child porn over the Internet? Would you take it offline?

        But that's not what happened. They didn't take the server offline. They banned a community. If some remote person had access to my pc and they were loading it up with child porn, I would not expect that deleting the folder would fix the problem. So I don't understand what your analogy is trying to accomplish because it's faulty.

        Also, I think you are confusing my question as some kind of disapproval. It isn't. If closing a community solves the problem then I fully support the admin team actions.

        I'm just questioning whether that really solves the problem or not. It was a community created on Lemmy.world, not some other instance. So if the perpetrators were capable of posting to it, they are capable of posting to any community on lemmy.world. You get that, yeah?

        My question is just a request for clarification. How does shutting down 1 community stop the perpetrators from posting the same stuff to other communities?

    • They also changed the account sign ups to be application only so people can't create accounts without being approved.

    • It doesn’t solve the bigger moderation problem, but it solves the immediate issue for the mods who don’t want to go to jail for modding a community hosting CSAM.

      • Doesn't that send a clear message to the perpetrators that they can cause any community to be shut down and killed, and all they have to do is post CSAM to it? What makes you or anyone else think that, upon seeing that lemmyshitpost is gone, the perpetrators will all just quit? Was lemmyshitpost the only community they were able to post in?

  • Is it possible to (at least temporarily):

    1. Turn off instance image hosting (disable pictrs)
    2. Disallow image and video posts across all communities
    3. As in Firefish, turn off caching of remote images from other instances.

    whilst longer-term solutions are sought? This would at least ensure poor mods aren't exposed to this shit, and an instance could be more confident it isn't inadvertently hosting CSAM.
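
    A stopgap along the lines of options 1 and 2: since stock Lemmy deployments put nginx in front of pict-rs, new uploads can be refused at the reverse proxy while existing media stays viewable. This is a hypothetical sketch, assuming pict-rs is proxied under /pictrs as in the standard setup (the upload path and the upstream name are assumptions and may differ by version):

```nginx
# Refuse new media uploads at the proxy; existing images remain viewable.
# Assumes the stock Lemmy layout where pict-rs handles /pictrs/image,
# and an upstream named "pictrs" — both are placeholders for your setup.
location /pictrs/image {
    # Uploads arrive as POSTs; GET/HEAD (viewing existing media) pass through.
    limit_except GET {
        deny all;
    }
    proxy_pass http://pictrs:8080;
}
```

    This degrades gracefully: image posts simply fail to upload, with no change to the Lemmy application itself, and it can be reverted by removing the block.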

  • Thank you for your work to keep that despicable trash out of our feeds. Sorry you have to deal with it. Fuck those losers.

  • These bad actors are weird. Most who spread CSAM don't want to be known about by the masses, but these posts seem intended to be visible, since they're posting in a well-known community. It feels sus.

    Keep up the good work keeping us safe, keep your therapist and/or support network in the loop about this so they can catch you if you fall.

  • Aren't there semi-automated tools that can detect CP?

    Those might be an automated solution to at least cut down on the volume.

    The same goes for known banned images: these can be identified automatically with perceptual hashing and denied at upload time.
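
    The perceptual-hash matching described above can be sketched in a few lines. This is a toy "average hash", assuming uploads have already been decoded and downscaled to an 8x8 grayscale grid (a real pipeline would use a library such as Pillow for decoding, and vetted hash sets from services like NCMEC's, rather than home-grown ones):

```python
# Toy "average hash" sketch: 1 bit per pixel of an 8x8 grayscale grid,
# set if the pixel is brighter than the grid's mean. Similar images
# produce hashes that differ in only a few bits.

def average_hash(pixels):
    """Hash an 8x8 grid of grayscale values (0-255) into a 64-bit int."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def is_banned(upload_hash, banned_hashes, max_distance=5):
    """Reject an upload whose hash is near any known-banned hash."""
    return any(hamming(upload_hash, h) <= max_distance for h in banned_hashes)
```

    The point of the Hamming-distance threshold is that re-encoding, resizing, or slight recoloring of an image changes only a few bits of the hash, so exact-match evasion tricks don't work the way they do against cryptographic hashes like SHA-256.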

  • Just saw this: That is fucking awful. Thank you for your service to the community and for protecting us from seeing that shit.

  • Sincerely appreciate your work to better this instance and the fediverse in its entirety.
