• I don’t understand why using AI is what makes this illegal. I don’t know the laws in Spain, but would it be illegal if they used a pencil or a paint brush? Seems like a weird line to draw if not.

    • @Ava@beehaw.org · 48 points · 4 months ago
      The minors were charged with 20 counts of creating child sex abuse images and 20 counts of offenses against their victims’ moral integrity.

      The article doesn’t claim that the AI is what makes it illegal, simply that AI was used. It’s literally the second sentence. Indeed, it goes on to highlight that there are legal novelties in prosecuting the use of AI.

    • @Darkassassin07@lemmy.ca · 30 points · 4 months ago

      As far as I understand, it’s not the tools used that make this illegal, but the realism/accuracy of the final product, regardless of how it was produced.

      If you were to have a high proficiency with manual Photoshop and produced similar-quality fakes, you’d be committing the same crime(s):

      creating child sex abuse images

      and

      offenses against their victims’ moral integrity

      The thing is, AI tools are becoming more and more accessible to teens. Time, effort, and skill are no longer roadblocks to creating these images, which leaves very, very little standing in an irresponsible teenager’s way…

      • @CanadaPlus@lemmy.sdf.org · 5 points · 4 months ago

        Which still seems kinda dumb. How realistic is too realistic? You could make a legal standard of “photography-like”, or something, just to define who to convict, but you still haven’t really justified why.

        The sentence in this case is just classes, though, so I’ll leave my pitchfork in the shed.

        • @Darkassassin07@lemmy.ca · 20 points · 4 months ago

          Did… Did you just ask why creating photo-realistic, sexually explicit material of real children should be illegal?

          • @CanadaPlus@lemmy.sdf.org · 12 points · 4 months ago

            Keep in mind these were other kids their age. We’re not talking about pedo stuff here.

            All the recent stuff about deepfakes feels a bit moral-panic-y to me. I think we should have a better reason than just ick before anyone gets thrown in jail.

            • @Kissaki@beehaw.org · 17 points · 4 months ago

              We’re not talking about pedo stuff here.

              Do you want an explanation of why creating and sharing sexually explicit material of other people without consent is problematic and damaging, and especially for children?

              • Eggyhead · 5 points · 4 months ago

                This is a really good idea. Perhaps this is what should be happening in the first place rather than resorting to direct legal enforcement, which can be problematic and damaging, especially for children.

                • Zoot · 9 points · 4 months ago

                  If you can’t understand that sharing naked photos of people is bad, then you probably should have to face the court system.

                  Like what? I don’t care how horny you are as a teenager; it takes a real fucking idiot and a huge shitstain to go and share those photos. They absolutely deserve to have the book thrown at them.

              • @CanadaPlus@lemmy.sdf.org · 1 point · 4 months ago

                Yes.

                I can see why we’d prohibit it, but somehow doing it in writing without involving the subject is pretty accepted (see: every fanfiction involving characters played by a specific real actor), and mentally doing it is like an informal human right.

                I’m honestly not trying to be obtuse here. It seems arbitrary to me. People have pictured me in all kinds of horrifying situations, I’m sure (probably more violent than sexy, but still). I’m not bothered, nor would I be if they made a collection of depictions (unless they sent some to me).

                • @Kissaki@beehaw.org · 5 points · 4 months ago

                  They shared sexually explicit images in WhatsApp groups. Do you consider that similar to having personal thoughts nobody will ever know of, or to written stories?

                  “were completely terrified and had tremendous anxiety attacks because they were suffering this in silence.”

                  Have you dismissed this quote? I don’t know where to start explaining how it’s different from what you described because of how far off it is. I have no idea where the baseline is to argue from.

                  Humans are social creatures. We form groups and want to be part of them. Teens are especially vulnerable, with their personalities, social norms, and sense of belonging still developing. Violating those norms, personal boundaries, and a person’s control over their own self-expression and self-presentation is deeply damaging in such a vulnerable phase of life.

                  They didn’t create a personal collection. They shared in their social groups.