• @0110010001100010@lemmy.world
      135 · 1 year ago

      Yeah, this is a scary, clickbaity headline meant to provoke a negative response about AI. AI is a tool, like a computer or a pipe wrench. It can be used for good or bad.

      The sicko got what he deserved, but the AI bit is rather over-the-top.

      • livus
        98 · 1 year ago

        The part that freaks me out is more that he was in an influential position in children’s lives and he was making images of the specific children who were his patients.

        • @Paradachshund@lemmy.today
          49 · 1 year ago

          This is unfortunately not that uncommon. Pedos often work in child-focused jobs. Very disturbing, and that’s why background checks are important in those fields.

          • @vacuumflower@lemmy.sdf.org
            36 · 1 year ago

            Not only pedos: in general, sadistic people tend to try to get jobs that give them a feeling of power over someone, and not all of them can become dictators, warlords, mere politicians, or even lowly prison guards or policemen; cowardice is a factor too. So: child-related jobs.

            But, to be frank, I’m not sure background checks are going to do that much good. People of this kind tend to bunch together, help each other, and can either slip under the radar rather easily or use those very checks to discredit anybody who’d be a threat to them.

            It’s a complex matter.

          • @afraid_of_zombies@lemmy.world
            5 · 1 year ago

            Which makes you wonder why religious groups can get around the requirements. I’m actually not against, say, the Church of LDS spending money to provide free therapy for children; I just want those therapists held to the same standards we hold regular therapists to, which includes background screenings for sexual offenses.

          • @jasondj@ttrpg.network
            2 · 1 year ago (edited)

            Is it honestly that surprising? Just because they are sexually attracted to kids does not mean they cannot love kids on an emotional level. I don’t think it’s impossible that there would be pedophiles who both love children and recognize that sexual and intimate contact is reprehensible.

            Put differently, I would much rather hear “child psychiatrist caught with computer-generated CSAM modeled after his patients” than “child psychiatrist caught with nude photos of his patients” or “child psychiatrist charged with sexual assault of a minor”. Comparatively speaking, the first is really just computer-assisted thoughtcrime, while the others mean there was actual direct harm to a child.

            Although in this particular instance, a child psychiatrist is a bit too close to the children involved, in my opinion.

        • @0110010001100010@lemmy.world
          20 · 1 year ago (edited)

          Exactly, it’s not the AI bit, it’s the rest of the story about how this dude was in a position of power to exploit children (and did so) that’s just fucking sick.

      • Ænima
        -9 · 1 year ago

        Username totally checks out. Definitely not AI or a bot.

    • @Ziglin@lemmy.world
      -70 · 1 year ago

      How come you are already using a short form? How often do you talk about this kind of thing???

      • Nfamwap
        57 · 1 year ago

        It’s pretty well established what CP stands for.

          • @SaakoPaahtaa@lemmy.world
            10 · 1 year ago

            CSAM sounds like a cool weapons system; I don’t want to tarnish that image, nor do I believe there’s anyone out there who thinks “porn” implies consent, be it about kids or not.

          • @LazyBane@lemmy.world
            4 · 1 year ago

            I think it’s more to include anything that’s sexually abusive, rather than just what is pornographic.