Not a good look for Mastodon - what can be done to automate the removal of CSAM?

  • @redcalcium@lemmy.institute
    34 points · 1 year ago
    I get what you’re saying, but due to the federated nature, CSAM can easily spread to many instances without their admins noticing. Having even one piece of CSAM on your server is a huge risk for the server owner.

    • MinusPi (she/they)
      28 points · 1 year ago
      I don’t see what a server admin can do about it other than defederate the instant they get reports. Otherwise, how could they possibly know?

      • krimsonbun
        -5 points · 1 year ago
        This could still be a really big issue, though. People can make instances for really hateful and disgusting crap, and even if everyone defederates from them, it’s still giving them a platform, a tiny corner of the internet to talk about truly horrible topics.

        • @andruid@lemmy.ml
          15 points · 1 year ago
          Again, if it’s illegal content that’s publicly available, officials can charge those site admins with the crime of hosting it. Everyone else just has a duty to defederate.

        • @priapus@sh.itjust.works
          13 points · 1 year ago
          Those corners will exist no matter what software they use, and there’s nothing Mastodon can do to stop it. There’s a reason public lists of instances to defederate exist. This content can only be taken down by domain registrars, hosting providers, and governments.