Sorry for the short post; I’m not able to write it up properly with full context at the moment, but I want to get this announcement out quickly to prevent confusion:

Unfortunately, people are uploading child sexual abuse images on some instances (apparently as a form of attack against Lemmy). I am taking some steps to prevent such content from making it onto lemm.ee servers. As one preventative measure, I am disabling all image uploads on lemm.ee until further notice - this is to ensure that lemm.ee cannot be used as a gateway to spread CSAM into the network.
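For admins of other instances wondering how to do something similar, one possible approach (a sketch only, not necessarily what lemm.ee is doing) is to reject new uploads at the reverse proxy while still serving existing images. Assuming a standard Lemmy deployment where pict-rs uploads arrive as POST requests to /pictrs/image and the upstream block is named "lemmy":

```nginx
# Sketch only: assumes pict-rs is reachable under /pictrs/image and
# that an upstream block named "lemmy" already exists in the config.
location /pictrs/image {
    # Allow GET (and implicitly HEAD) so existing images keep loading;
    # deny POST and every other method, which blocks new uploads.
    limit_except GET {
        deny all;
    }
    proxy_pass http://lemmy;
}
```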

It will not be possible to upload any new avatars or banners while this limit is in effect.

I’m really sorry for the disruption; it’s a necessary trade-off for now until we figure out the way forward.

  • ScrollinMyDayAway@lemm.ee · 2 years ago

    This is sick. Kudos to mods for dealing with this garbage. I hope the posters are all hunted down and punished.

    • DudePluto@lemm.ee · 2 years ago

      Yeah, the admins deserve all our support on this. Not only to protect themselves as server owners, but to stop the spread. Hopefully a long-term solution will be found soon.

      • We doubled the number of mods and banned anything on-site that remotely resembled that material. Sadly, many times a brave lemmygrad user had to check it first and take the bullet so the rest of us could report it. I was one of those people on several occasions, and I still cringe at the memories. It lasted a few months, iirc. I haven’t seen whatever is hitting you guys, but our bots had some recognizable features, usually hiding their spam behind spoilers or links.

        It really was just a mobilization, a lockdown, and purging everything suspicious until it stopped. That, or they found a way to block those bots. I wasn’t in the command center by any means, so I don’t know much about the internal decisions.

  • Awoo [she/her]@hexbear.net · 2 years ago

    If you’re concerned about legal liability, I think it’s worth noting that there is some protection for websites in this matter. For the most part, as long as you’re taking “reasonable action” against it you’re not liable, and most laws take into consideration the resources of the site dealing with the uploads.

    Not pleasant for users, of course. And the speed at which it’s handled is obviously a concern.

  • TheAndrewBrown@lemm.ee · 2 years ago

    I think this is a great move until we have something rock solid to prevent this. There are tons of image hosting sites you can use (most of which already have the resources to try to prevent this stuff), so it shouldn’t really cause much inconvenience.

  • Cris@lemm.ee · 2 years ago

    I know there are automated tools that exist for detecting CSAM - given the challenges the fediverse has had with this issue, it really feels like it’d be worthwhile for the folks developing platforms like Lemmy and Mastodon to start thinking about how to integrate those tools with their platforms to better support moderators and folks running instances.
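    As a rough illustration of what such an integration could look like, here is a minimal Python sketch of an upload hook that checks files against a blocklist of known-bad hashes before anything is published. Everything in it is a hypothetical stand-in (the blocklist and the notify_moderators/publish helpers); real tools such as Microsoft’s PhotoDNA or Meta’s PDQ match perceptual hashes rather than exact cryptographic ones, precisely so resized or re-encoded copies are still caught:

    ```python
    # A rough sketch of a pre-publication scanning hook. The blocklist
    # and the notify_moderators/publish helpers are hypothetical
    # stand-ins; real tools use perceptual hashes, not plain SHA-256.
    import hashlib
    from pathlib import Path

    # Hypothetical set of known-bad image hashes from a trusted
    # hash-sharing programme (left empty here as a placeholder).
    KNOWN_BAD_HASHES: set[str] = set()


    def is_blocked(upload: Path) -> bool:
        """Return True if the uploaded file matches a known-bad hash."""
        digest = hashlib.sha256(upload.read_bytes()).hexdigest()
        return digest in KNOWN_BAD_HASHES


    def notify_moderators(name: str) -> None:
        # Hypothetical: a real integration would alert the mod team.
        print(f"[ALERT] blocked upload: {name}")


    def publish(upload: Path) -> None:
        # Hypothetical: a real integration would make the file publicly
        # reachable (e.g. hand it off to pict-rs).
        print(f"published {upload.name}")


    def handle_upload(upload: Path) -> None:
        # Scan before the file ever becomes publicly reachable.
        if is_blocked(upload):
            upload.unlink()  # drop the file immediately
            notify_moderators(upload.name)
            return
        publish(upload)
    ```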