Not a good look for Mastodon - what can be done to automate the removal of CSAM?

  • @priapus@sh.itjust.works
    3 • 1 year ago

    Definitions of CSAM definitely do not include illustrated and simulated forms. They do not have a victim and therefore cannot be abuse. I agree that it should not be allowed on public platforms, which is why all instances hosting it should be defederated. Despite this, it is not illegal, so reporting it to the authorities is a waste of time for you and for the authorities, who are trying to remove and prevent actual CSAM.

    • @balls_expert@lemmy.blahaj.zone
      1 • edit-2 • 1 year ago

      CSAM definitions absolutely include illustrated and simulated forms. Just check the sources on the Wikipedia link and climb your way up; you’ll see “cartoons, paintings, sculptures, …” in the wording of the PROTECT Act.

      They don’t actually need a victim to be defined as such

      • @priapus@sh.itjust.works
        1 • 1 year ago

        That Wikipedia article is about CP, a broader topic. Practically zero authorities will include illustrated and simulated forms of CP in their definitions of CSAM.

          • @priapus@sh.itjust.works
            1 • 1 year ago

            That’s not what I was debating. I was debating whether or not it should be reported to authorities. I made it clear in my previous comment that it is disturbing and should always be defederated.

            • Ah. It depends on the jurisdiction the instance is in.

              Mastodon has a lot of lolicon shit on Japan-hosted instances for that reason.

              Lolicon is illegal under the US PROTECT Act of 2003 and in plenty of other countries.