In 2023, more deepfake abuse videos were shared than in every other year in history combined, according to an analysis by independent researcher Genevieve Oh. What used to take skillful, tech-savvy experts hours to Photoshop can now be whipped up at a moment’s notice with the help of an app. Some deepfake websites even offer tutorials on how to create AI pornography.

What happens if we don’t get this under control? As politics grow more and more polarized, it will further blur the line between what’s real and what’s not. What will happen when voters can’t separate truth from lies? And what are the stakes? As we get closer to the presidential election, democracy itself could be at risk. And, as Ocasio-Cortez points out in our conversation, it’s about much more than fabricated images.

“It’s so important to me that people understand that this is not just a form of interpersonal violence, it’s not just about the harm that’s done to the victim,” she says about nonconsensual deepfake porn. She puts down her spoon and leans forward. “Because this technology threatens to do it at scale — this is about class subjugation. It’s a subjugation of entire people. And then when you do intersect that with abortion, when you do intersect that with debates over bodily autonomy, when you are able to actively subjugate all women in society on a scale of millions, at once digitally, it’s a direct connection [with] taking their rights away.”

  • @dsemy@lemm.ee · 16 points · 3 months ago

    It’s too late at this point, IMO; you can make AI-generated porn on your PC… How exactly are they going to stop it?

    • @RainfallSonata@lemmy.world · 17 points · 3 months ago

      “The legislation amends the Violence Against Women Act so that people can sue those who produce, distribute, or receive the deepfake pornography, if they ‘knew or recklessly disregarded’ that the victim did not consent to those images.” (from the article)

      • @dsemy@lemm.ee · 5 points · 3 months ago

        I read the article… amending a law doesn’t make the problem go away.

        Maybe if more attention had been given to the politicians who were talking about this half a decade ago (instead of focusing on AOC, who honestly realized this issue way too late), something more meaningful could have been done.

          • @umbrella@lemmy.ml · 2 points · edited · 3 months ago

            It isn’t either/or.

            Kind of; this is like playing the blame game for climate change, but 30 years in the future.

            I’m sure dealing with it before it became so ubiquitous would have been easier.

            • DarkThoughts · 12 points · 3 months ago

              My issue with the topic is that everyone targets the wrong thing and just jumps on the media hysteria. They are not going to be able to stop the production or distribution of deepfakes, and IMO they shouldn’t, because deepfakes are basically just an advanced form of the photo and video editing that has existed for decades, and it did not bother anyone until “AI” became a media scapegoat. What they should bother to enforce is the illegitimate use of such material, for things like blackmail, bullying, disinformation, etc.

              Some neckbeards wanking one out to a clearly marked deepfake porn video isn’t really going to harm the person depicted. Using such a video to smear or blackmail them by claiming it is real, on the other hand, is. And this type of bullying has also been going on for decades through classical photo and video manipulation, and again, it did not bother anyone until now. By focusing on this idiotic media hype instead of the real issue, we basically make sure that it keeps on happening, just like with climate change.

              • @umbrella@lemmy.ml · 3 points · edited · 3 months ago

                Alright, you have a point, although AI made this process infinitely simpler. If you have a remotely sufficient computer and some knowledge of how to operate the tools, you can do that in a matter of a few hours of actual work.

                I’m surprised there aren’t more deepfakes.

                • DarkThoughts · 0 points · 3 months ago

                  It’s usually quite a hassle to set those tools up, especially if you don’t have much technical knowledge. A lot of the more resource-heavy tasks are also not really possible on a home computer and require big servers with multiple GPUs and absurd amounts of VRAM, or very specific APUs, but those are still very early. The majority of what you can do at home is typically limited to generating pictures, and even there it takes quite a bit of time if you want really high-quality stuff. For a lot of more complex tasks you’re simply resource-limited. And in regards to time, I’m just talking about the actual generation process. Getting good results, and learning how to get them, is another lengthy process that many people underestimate. It’s not a magic button, because those LLMs are pretty damn stupid, actually.

      • @starman@programming.dev · 3 points · 3 months ago

        “people can sue those who produce, distribute, or receive the deepfake pornography”

        So can I send someone deepfake porn and then sue them?

    • @General_Effort@lemmy.world · 3 points · 3 months ago

      The law creates a new kind of intellectual property, so one would expect the enforcement problems to be similar to copyright. However, there are some big differences.

      One is that the minimum damages are 150k USD + attorney’s fees/costs. That’s going to unleash quite some entrepreneurial zeal.

      To be on the hook, “possession with intent to distribute” is enough if one “recklessly disregards” that a depicted individual did not consent. E.g., if you come across nudes of some celebrity on your Lemmy instance, you had better delete them immediately; assuming that the celebrity consented to the images being shared sounds like “reckless disregard” to me. If it’s just some random person, then it’s no problem.

      This definitely will make some people quite a lot of money.