AI-created child sexual abuse images ‘threaten to overwhelm internet’::Internet Watch Foundation finds 3,000 AI-made abuse images breaking UK law

  • @EatYouWell@lemmy.world · 20 · 1 year ago

    Honestly, even though I find the idea abhorrent, if it prevents actual children from being abused…

    I mean, the content is going to be generated one way or another.

    • HeavyDogFeet · -1 · edited · 1 year ago

      Does it? Or is it just bonus content for pedophiles? Just because they’re now getting thing B doesn’t mean they’re not also still getting thing A. In fact, there’s nothing to suggest that this wouldn’t just make things worse. What’s to stop them from using it as a sandbox to test out shit they’ve been too timid to do themselves in real life? Little allowances like this are actually a pretty common way for people to build up to committing bolder crimes. It’s a textbook pattern for serial killers; what’s to say it wouldn’t serve the same purpose here?

      But hey, if it does result in less child abuse material being created, that’s great. But there’s no evidence that this is actually how it will play out. It’s just wishful thinking because people want to give generative AI the benefit of the doubt that it is a net positive for society.

      Anyway, rant over. You might be able to tell that I have strong feelings about the benefits and dangers of these tools.

      • @Igloojoe@lemm.ee · 10 · 1 year ago

        Your argument sounds very similar to when people argue that video games promote violence and criminal activity.

        • HeavyDogFeet · 2 · edited · 1 year ago

          That’s quite a stretch. For a start, playing video games isn’t illegal. Generating child porn is. Graduating from something innocent to something criminal is very different from starting off at one of the most heinous crimes in modern society and then continuing to do different variations of that same crime.

      • @burliman@lemm.ee · 5 · 1 year ago

        I’m guessing they could easily support this with a simple premise: examine a legal fetish, which AI can generate images of, and ask people who generate those images whether their consumption of real images has fallen as a result. Also check whether actual real-life participation in it has been reduced by the ability to generate the scenarios privately.

        The results will be skewed, since participating in a legal fetish won’t land you in jail. But there may be some fetishes out there that carry other, non-legal risks, which could help control for that.

        • @Heavybell@lemmy.world · 1 · 1 year ago

          Where does the fact some people will happily jerk it exclusively to anime titty come into this, I wonder?

          • @Brahminman@iusearchlinux.fyi · 1 · 1 year ago

            It doesn’t? They’re a fucking edge case, and if you only jerk it to anime then nobody will ever be harmed in the production of the porn you watch, so nobody should care or read too much into it.

    • @MTK@lemmy.world · -6 · 1 year ago

      Also, the models were trained on real images, so every image these tools create is directly tied to the rape of thousands, or even tens of thousands, of children.

      Real or not, these images came from real children who were raped in the worst ways imaginable.

          • @Bye@lemmy.world · 1 · 1 year ago

            You don’t need the exact content you want in order to train a model (a LoRA) for Stable Diffusion. If you train on naked adults and clothed kids, it can make some gross shit. And there are a lot more of those safe pictures out there to use for training. I’d bet my left leg that these models were trained that way.

            • @MTK@lemmy.world · 1 · 1 year ago

              Why? If these people have access to those images, why would you bet that they don’t use them?

              There are dark web sites that have huge sets of CSAM, why would these people not use that? What are you betting on? Their morals?