
Lemmyshitpost community closed until further notice

Hello everyone,

We unfortunately have to close the !lemmyshitpost community for the time being. We have been fighting the CSAM (Child Sexual Abuse Material) posts all day, but there is nothing we can do because they will just post from another instance since we changed our registration policy.

We keep working on a solution; we have a few things in the works, but that won't help us right now.

Thank you for your understanding and apologies to our users, moderators and admins of other instances who had to deal with this.

Edit: @Striker@lemmy.world, the moderator of the affected community, made a post apologizing for what happened. But this could not be stopped even with 10 moderators. And if it wasn't his community, it would have been another one. And it is clear this could happen on any instance.

But we will not give up. We are lucky to have a very dedicated team and we can hopefully make an announcement about what's next very soon.

Edit 2: removed the bit about the moderator tools. That came out a bit harsher than we meant it. It's been a long day, and having to deal with this kind of stuff got some of us a bit salty, to say the least. Remember we also had to deal with people posting scat not too long ago, so this isn't the first time we've felt helpless. Anyway, I hope we can announce something more positive soon.

705 comments
  • It is seriously sad and awful that people would go this far to derail a community. It makes me concerned for other communities as well. Since they have succeeded in getting shitpost closed, does this mean they will just move on to the next community? That being said, here is some very useful information on the subject and what can be done to help curb CSAM.

    The National Center for Missing & Exploited Children (NCMEC) CyberTipline: You can report CSAM to the CyberTipline online or by calling 1-800-843-5678. Your report will be forwarded to a law enforcement agency for investigation.

    The National Sexual Assault Hotline: If you or someone you know has been sexually assaulted, you can call the National Sexual Assault Hotline at 800-656-HOPE (4673) or chat online. The hotline is available 24/7 and provides free, confidential support.

    The National Child Abuse Hotline: If you suspect child abuse, you can call the National Child Abuse Hotline at 800-4-A-CHILD (422-4453). The hotline is available 24/7 and provides free, confidential support.

    Thorn: Thorn is a non-profit organization that works to fight child sexual abuse. They provide resources on how to prevent CSAM and how to report it.

    Stop It Now!: Stop It Now! is an organization that works to prevent child sexual abuse. They provide resources on how to talk to children about sexual abuse and how to report it.

    Childhelp USA: Childhelp USA is a non-profit organization that provides crisis intervention and prevention services to children and families. They have a 24/7 hotline at 1-800-422-4453.

    Here are some tips to prevent CSAM:

    Talk to your children about online safety and the dangers of CSAM.

    Teach your children about the importance of keeping their personal information private.

    Monitor your children's online activity.

    Be aware of the signs of CSAM, such as children being secretive or withdrawn, or having changes in their behavior.

    Report any suspected CSAM to the authorities immediately.

  • The number of people in these comments asking the mods not to cave is bonkers.

    This isn’t Reddit. These are hobbyists without legal teams to a) fend off false allegations or b) comply with laws that they don’t have any deep understanding of.

  • Fucking bastards. I don't even know what beef they have with the community and why, but using THAT method to get them to shut down is nothing short of despicable. What absolute scum.

  • We have been fighting the CSAM (Child Sexual Abuse Material) posts all day but there is nothing we can do because they will just post from another instance since we changed our registration policy.

    It's likely that we'll be seeing a large number of instances switch to whitelist-based federation instead of the current blacklist-based approach, especially niche instances that do not want to deal with this at all (and I don't blame them).
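
    For context, a minimal sketch of what allowlist federation can look like in Lemmy's lemmy.hjson on older releases (newer versions manage the federation allowlist from the admin settings UI instead); the instance names are placeholders:

    ```hjson
    # Sketch only: allowlist-based federation as exposed in older Lemmy
    # releases' lemmy.hjson; newer versions configure this in the admin UI.
    # Instance names are placeholders.
    federation: {
      enabled: true
      # With an allowlist set, only these instances can federate with
      # yours; everything else is ignored by default, instead of having
      # to block known-bad instances one at a time.
      allowed_instances: ["trusted.example", "alsotrusted.example"]
    }
    ```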

  • How does closing lemmyshitpost do anything to solve the issue? Isn't it a foregone conclusion that the offenders would just start targeting other communities, or was there something unique about lemmyshitpost that made it more susceptible?

    • It stops their instance from hosting CSAM and removes the legal liability of dealing with something they don't have the capacity to handle at this point in time.

      How would you respond to having someone else forcibly load up your pc with child porn over the Internet? Would you take it offline?

      • How would you respond to having someone else forcibly load up your pc with child porn over the Internet? Would you take it offline?

        But that's not what happened. They didn't take the server offline. They banned a community. If some remote person had access to my pc and they were loading it up with child porn, I would not expect that deleting the folder would fix the problem. So I don't understand what your analogy is trying to accomplish because it's faulty.

        Also, I think you are confusing my question with some kind of disapproval. It isn't. If closing a community solves the problem, then I fully support the admin team's actions.

        I'm just questioning whether that really solves the problem or not. It was a community created on Lemmy.world, not some other instance. So if the perpetrators were capable of posting to it, they are capable of posting to any community on lemmy.world. You get that, yeah?

        My question is just a request for clarification. How does shutting down 1 community stop the perpetrators from posting the same stuff to other communities?

    • They also changed account sign-ups to be application-only, so people can't create accounts without being approved.

    • It doesn’t solve the bigger moderation problem, but it solves the immediate issue for the mods who don’t want to go to jail for modding a community hosting CSAM.

      • Doesn't that send a clear message to the perpetrators that they can get any community shut down and killed just by posting CSAM to it? What makes you or anyone else think that, upon seeing lemmyshitpost is gone, the perpetrators will all just quit? Was lemmyshitpost the only community they were able to post in?

  • Is it possible to (at least temporarily):

    1. Turn off instance image hosting (disable pictrs)
    2. Disallow image and video posts across all communities
    3. As in Firefish, turn off caching of remote images from other instances.

    while longer-term solutions are sought? This would at least ensure poor mods aren't exposed to this shit, and an instance could be more confident it's not inadvertently hosting CSAM.
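
    As a stopgap, points 1 and 2 could even be approximated at the reverse proxy rather than in Lemmy itself: deny everything except GET on the pictrs image endpoint, so existing images keep loading but new uploads fail. A minimal nginx sketch, assuming pictrs is proxied under /pictrs and an upstream named lemmy (both deployment-specific assumptions):

    ```nginx
    # Sketch only: block new image uploads at the reverse proxy while
    # still serving existing images. Assumes pictrs is reachable under
    # /pictrs and an upstream called "lemmy"; adjust to your deployment.
    location /pictrs/image {
        # GET (and HEAD) still work, so already-hosted images keep
        # loading; POST/PUT/etc. are denied, so nothing new is uploaded.
        limit_except GET {
            deny all;
        }
        proxy_pass http://lemmy;
    }
    ```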

  • Thank you for your work to keep that despicable trash out of our feeds. Sorry you have to deal with it. Fuck those losers.

  • These bad actors are weird; most who spread CSAM don't want to be known about by the masses, yet these posts seem intended to be visible, since they're posting in a well-known community. It feels sus.

    Keep up the good work keeping us safe, keep your therapist and/or support network in the loop about this so they can catch you if you fall.

  • Aren't there semi-automated tools that can detect CP?

    Those might be an automated solution to at least cut down on the volume.

    The same goes for previously banned images: these can be automatically identified with perceptual hashing and denied at upload time.
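
    For illustration, a minimal sketch of that perceptual-hashing idea in Python with the imagehash library (pip install imagehash). The denylist hash and distance threshold below are made-up placeholders; real detection systems such as PhotoDNA rely on vetted hash databases that are not publicly available:

    ```python
    # Sketch only: match an uploaded image against a denylist of
    # perceptual hashes; the denylist and threshold are placeholders.
    from PIL import Image
    import imagehash

    # Hypothetical denylist of perceptual hashes of previously banned images.
    DENYLIST = [imagehash.hex_to_hash("d1c4d1c4d1c4d1c4")]

    # Maximum Hamming distance at which two hashes count as the same image;
    # a small tolerance catches re-encodes, resizes, and minor edits.
    MAX_DISTANCE = 5

    def is_banned(path: str) -> bool:
        """Return True if the image at `path` matches a denylisted hash."""
        candidate = imagehash.phash(Image.open(path))
        return any(candidate - banned <= MAX_DISTANCE for banned in DENYLIST)

    if is_banned("upload.png"):
        print("Upload rejected: matches a banned image.")
    ```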

  • Just saw this: That is fucking awful. Thank you for your service to the community and for protecting us from seeing that shit.

  • Sincerely appreciate your work to better this instance and the fediverse in its entirety.

  • This whole situation is shitty all around, I was really hoping it would be an isolated incident and that they wouldn't come back. I think I speak for everyone when I say that the bastards posting CSAM need to be jailed or worse. Disagree with an instance or community all you want, the instant you pull something like this, you've lost every single argument and are irredeemably a horrible person not worthy of touching a computer ever again.

    Once again pedophiles ruin everything nice. I know me saying this isn't that helpful since I can't do anything about it, but I'm sorry this is happening on your instance (and is getting federated to other instances). Don't worry about inconveniencing regular users; taking action against CSAM is far more important, and any halfway decent user will understand.

    Some resources for reporting CSAM if you come across it anywhere (not just Lemmy):

    US: https://www.missingkids.org/cybertipline

    Canada: https://cybertip.ca/app/en/

    International: https://www.inhope.org/

    Last but not least, a reminder that if you accidentally load CSAM on your device, in most cases it will get cached to your storage, because the majority of apps and browsers employ disk caching by default. You should at the very least clear your caches, then trim and fully wipe the free space on your device (maybe also directly shred the actual files if you can do that/know how to). Also know whether you have any mandatory reporting laws where you live and comply with them. (EDIT: another commenter mentioned that in some jurisdictions you might actually not be allowed to delete the files immediately and, presumably, have to contact the police right away to report the active file on your device.) CYA to prevent yourself from getting screwed because of someone else's horrible acts.

    Something I've been thinking about since this whole thing started: it might also be helpful to use a no-disk-cache browser/app (or disable disk caching on your current browser/app if you are able to) if you do wish to keep using Lemmy, at least until this whole thing blows over. That way you can just close the page/program and reboot your device, and the local copy should be gone. This matters especially because flash storage devices cannot be reliably wiped with the "fill the drive with blank data" method (I'm not sure how big the risk is of it ending up in the swapfile or otherwise sticking around, or at what point it stops counting as possession).

    Being exposed to CSAM is a nightmare for this reason, and unfortunately there seem to be no good resources on what to do if you're exposed.

    I am not a lawyer and no part of this comment is legal advice.
