• @rustyfish@lemmy.world • 77 points • 5 months ago

    “Many girls were completely terrified and had tremendous anxiety attacks because they were suffering this in silence,” she told Reuters at the time. “They felt bad and were afraid to tell and be blamed for it.”

    WTF?!

    • @sigmaklimgrindset@sopuli.xyz • 55 points • 5 months ago

      Spain is a pretty Catholic country, and even if religious attendance is dropping off, the ingrained beliefs can still remain. Madonna/Whore dichotomy still is very prevalent in certain parts of society there.

      • @sam@lemmy.cafe • -1 point • 5 months ago

        pretty Catholic

        I don’t know what led you to believe that, but just look at Wikipedia: only 56% of the population is Catholic, with 37.5% non-practising (and, in my experience as a Spaniard, agnostic) and 16% actually practising.

        https://en.m.wikipedia.org/wiki/Spain

        • @sigmaklimgrindset@sopuli.xyz • 1 point • 5 months ago

          You read the first 6 words of my comment and just ignored the rest of it. Tell me why Holy Week is one of the biggest events in Spain even though “only” half the population is Catholic.

          The whole point I was making was that even if people identify as atheists, agnostics, or non-practicing, the remnants of the Catholic mindset and culture remain, including the misogyny inherent to most organized religions.

      • @leftzero@lemmynsfw.com • -1 point • 5 months ago

        Spain is a pretty Catholic country

        Hasn’t been in decades. No one in Spain gives a flying fuck about religion. Even the ones that go on processions and whatnot only do it because it’s traditional, and because they like dressing up and wearing silly hats.

        The kids didn’t tell because they’re kids. They thought they’d be blamed for it because kids get blamed for everything, and because they know their parents don’t understand how generative “AI” works and would believe they were actually taking naked pictures and circulating them themselves.

    • @Zak@lemmy.world • 118 points • 5 months ago

      Are you surprised by teenage boys making fake nudes of girls in their school? I’m surprised by how few of these cases have made the news.

      I don’t think there’s any way to put this cat back in the bag. We should probably work on teaching boys not to be horrible.

      • yeehaw • 36 points • 5 months ago

        I’m not sure you can teach boys not to be horny teenagers 😜

        • DarkThoughts • 62 points • 5 months ago

          Being horny is one thing, sharing this stuff is another. If whoever made the fakes had kept them to themselves, nobody would’ve even known. The headline is still ass and typical “AI” hysteria, though.

          • @lenz@lemmy.ml • -3 points • 5 months ago

            They shouldn’t have generated it in the first place. How would you feel if people did that to your mom, or you, or your sisters, or your kids?

            I don’t think just keeping it to yourself is enough.

            • DarkThoughts • 12 points • 5 months ago

              I don’t really care. People can and will fantasize in the same way about other people too and I’m not going to play thought police.

            • @BruceTwarzen@lemm.ee • 2 points • 5 months ago

              Man, can you imagine? Someone cutting out my mom’s head and gluing it onto a pornstar? I would kill myself.

        • @Zak@lemmy.world • 61 points • 5 months ago

          Having been a teenage boy myself, I wouldn’t dream of trying.

          But I knew it wasn’t OK to climb a tree with binoculars to try to catch a glimpse of the girl next door changing clothes, and I knew it wasn’t OK to touch people without their consent. I knew people who did things like that were peeping toms and rapists. I believed peeping toms and rapists would be socially ostracized and legally punished more harshly than they often are in reality.

          Making and sharing deepfakes of real people without their consent belongs on the same spectrum.

      • Pennomi • 30 points • 5 months ago

        There are always two paths to take - take away all of humanity’s tools or aggressively police people who abuse them. No matter the tool (AI, computers, guns, cars, hydraulic presses) there will be somebody who abuses it, and for society to function properly we have to do something about the delinquent minority of society.

        • The Pantser • 16 points • 5 months ago

          No matter the tool (AI, computers, guns, cars, hydraulic presses) there will be somebody who abuses it,

          Hydraulic press channel guy offended you somehow? I’m missing something here.

          • Pennomi • 21 points • 5 months ago

            No, just an example. But if you’ve ever noticed the giant list of safety warnings on industrial machinery, you should know that every single one of those rules was written in blood.

            • @0x0@programming.dev • 3 points • 5 months ago

              Either Darwin awards or assholes, most likely. Those warnings are written due to fear of lawsuit.

            • hendrik • 1 point • edited • 5 months ago

              However, this tool doesn’t have any safety warnings written on it. The app they used specifically caters to use cases like this. It’s advertised for immoral use, and technology to estimate age from pictures has existed for like 10 years, yet they deliberately chose to let their tool generate pictures of 13-year-old girls. In the tool analogy, that’s like knowingly selling a jigsaw that misses well-established safety standards and is likely to injure someone. And it’s debatable whether it was made to cut wood at all, or just to injure people.
              And the rest fits, too: no company address, located in some country where they can’t be prosecuted… They’re well aware of their app’s use case.

          • @Ookami38@sh.itjust.works • 8 points • 5 months ago

            I don’t think they’re offended. I think they’re saying that a tool is a tool. A gun or AI are only dangerous if misused, like a hydraulic press.

            We can’t go around removing the tools because some people will abuse them. Any tool can kill someone.

        • @catloaf@lemm.ee • 6 points • 5 months ago

          We could also do a better job of teaching people from childhood not to be assholes.

        • @afraid_of_zombies@lemmy.world • 2 points • 5 months ago

          Guns do not belong in the list. Guns are weapons, not tools. Don’t bother posting some random edge case that accounts for approximately 0.000001% of use. This is a basic category error.

          Governments should make rules banning and/or regulating weapons.

          • Pennomi • 11 points • 5 months ago

            Weapons are tools, by strict definition, and there are legitimate uses for them. Besides, my point was that they should be regulated. In fact, because they are less generally useful than constructive tools, they should be regulated far MORE strictly.

            • @afraid_of_zombies@lemmy.world • -7 points • 5 months ago

              Weapons are tools,

              Prove it. Prove that the majority of people think of a gun in the same way as they do a screwdriver.

              by strict definition,

              Assertion without evidence

              and there are legitimate uses for them.

              I see we didn’t read what I wrote, only the first sentence of what I wrote.

              Besides, my point was that they should be regulated. In fact, because they are less generally useful than constructive tools, they should be regulated far MORE strictly.

              By “generally” you mean “not even close to them”, yes.

              • Pennomi • 9 points • 5 months ago

                It seems we can’t have a reasonable discourse here because you are ignoring basic definitions. Have a lovely day!

                • @afraid_of_zombies@lemmy.world • -8 points • 5 months ago

                  No, you are pulling a libertarian. You defined a word that is used a particular way to mean what you want it to mean, then declared victory.

                  You are not arguing step-by-step, you are bypassing.

      • @BruceTwarzen@lemm.ee • 1 point • 5 months ago

        It’s like those X-ray apps that obviously didn’t work but were promoted as letting you see women naked. Somehow that was very cool and no one cared. Suddenly there’s something that kinda works and everyone is shocked.

      • @Evotech@lemmy.world • -4 points • 5 months ago

        Teenagers are literally retarded. Like their reasoning centers are not developed and they physically cannot think. There’s no way to teach that

  • @IllNess@infosec.pub • 39 points • 5 months ago

    They are releasing stories like this to promote the new law that requires adults to log in to porn sites and to limit their use of them.

  • AutoTL;DR [bot] • 16 points • 5 months ago

    This is the best summary I could come up with:


    A court in south-west Spain has sentenced 15 schoolchildren to a year’s probation for creating and spreading AI-generated images of their female peers in a case that prompted a debate on the harmful and abusive uses of deepfake technology.

    Police began investigating the matter last year after parents in the Extremaduran town of Almendralejo reported that faked naked pictures of their daughters were being circulated on WhatsApp groups.

    Each of the defendants was handed a year’s probation and ordered to attend classes on gender and equality awareness, and on the “responsible use of technology”.

    Under Spanish law minors under 14 cannot be charged but their cases are sent to child protection services, which can force them to take part in rehabilitation courses.

    In an interview with the Guardian five months ago, the mother of one of the victims recalled her shock and disbelief when her daughter showed her one of the images.

    “Beyond this particular trial, these facts should make us reflect on the need to educate people about equality between men and women,” the association told the online newspaper ElDiario.es.


    The original article contains 431 words, the summary contains 181 words. Saved 58%. I’m a bot and I’m open source!

    • @Zeratul@lemmus.org • 3 points • 5 months ago

      What does this have to do with the equality of men and women? Girls are more at risk of this kind of abuse? That’s a good point, but it’s not the one being made here. The parent comment is trying to make political something that simply isn’t. Not that gender equality should be political in the first place.

  • @Sensitivezombie@lemmy.zip • 6 points • 5 months ago

    Why not also go after these software companies for allowing such images to be generated in the first place, i.e. for allowing AI-generated naked bodies to be applied to uploaded images of real people?

    • @Duamerthrax@lemmy.world • 4 points • 5 months ago

      How? How could you make an algorithm that correctly identified what nude bodies look like? Tumblr couldn’t differentiate between nudes and sand dunes back when they enforced their new policies.

    • @KairuByte@lemmy.dbzer0.com • 2 points • 5 months ago

      This sounds great, but it’s one of those things that is infinitely easier to say than do. You’re essentially asking for one of two things: Manual human intervention for every single image uploaded, or “the perfect image recognition system.” And honestly, the first is fraught with its own issues, and the second does not exist.

  • @kibiz0r@midwest.social • -7 points • 5 months ago

    “Absolutely no way to prevent this”, says internet full of banners offering to “Undress your classmates now!”

    “Tools are just tools, and there’s no sense in restricting access to undress_that_ap_chemistry_hottie.exe because it wouldn’t prevent even a single case of abuse and would also destroy every legitimate use of any computer anywhere”, said user deepfake-yiff69@lemmy.dbzer0.com

    • @spamfajitas@lemmy.world • 13 points • 5 months ago

      It’s possible I just haven’t come across those types of comments you’re making fun of, but I usually just see people making the case that we don’t need new, possibly overreaching, legislation to handle these situations. They want to avoid a disingenuous “think of the children” kind of situation.

      a youth court in the city of Badajoz said it had convicted the minors of 20 counts of creating child abuse images and 20 counts of offences against their victims’ moral integrity

      I’m not familiar with their legal system but I would be willing to bet the crimes they’ve committed were already illegal under existing laws.