In this video I discuss how generative AI technology has grown far past the government's ability to effectively control it, and how current legislative measures could lead to innocent people being jailed.

  • @mindbleach@sh.itjust.works
    31 · 1 year ago

    There is no such thing.

    God dammit, the entire point of calling it CSAM is to distinguish photographic evidence of child rape from made-up images that make people feel icky.

    If you want them treated the same, legally - go nuts. Have that argument. But stop treating the two as the same thing, and fucking up clear discussion of the worst thing on the internet.

    You can’t generate assault. It is impossible to abuse children who do not exist.

    • @m0darn@lemmy.ca
      29 · 1 year ago

      Did nobody in this comment section watch the video at all?

      The only case mentioned in this video is one where high school students distributed (counterfeit) sexually explicit images of their classmates that had been generated by an AI model.

      I don’t know if it meets the definition of CSAM because the events depicted in the images are fictional, but the subjects are real.

      These children do exist, and some have doubtless been traumatized by this. This crime has victims.

    • rurutheguru
      7 · 1 year ago

      I think a lot of people are arguing that the models used to generate this type of content are trained on literal CSAM. So it's like CSAM with extra steps.