A bipartisan group of US senators introduced a bill Tuesday that would criminalize the spread of nonconsensual, sexualized images generated by artificial intelligence. The measure comes in direct response to the proliferation of pornographic AI-made images of Taylor Swift on X, formerly Twitter, in recent days.

The measure would allow victims depicted in nude or sexually explicit “digital forgeries” to seek a civil penalty against “individuals who produced or possessed the forgery with intent to distribute it” or anyone who received the material knowing it was not made with consent. Dick Durbin, the US Senate majority whip, and senators Lindsey Graham, Amy Klobuchar and Josh Hawley are behind the bill, known as the Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024, or the “Defiance Act.”


  • @fishos@lemmy.world · 16 · 10 months ago

    And here we have the real answer: prudishness. “It’s gross.” And of course “think of the children.” You don’t have a real answer; you have fear-mongering.

    • @MagicShel@programming.dev · 15 · 10 months ago

      I agree the issue is one of puritan attitudes toward sex and nudity. If no one gave a fuck about nude images, they wouldn’t be humiliating, and if they weren’t humiliating then the victim wouldn’t really even be a victim.

      However we live in the world we live in and people do find it embarrassing and humiliating to have nude images of themselves made public, even fakes, and I don’t think it’s right to tell them they can’t feel that way.

      They shouldn’t ever have been made to feel their bodies are something to be embarrassed about, but they have been and it can’t be undone with wishful thinking. Societal change must come first. But that complication aside, I agree with you completely.

      • @gapbetweenus@feddit.de · 7 · 10 months ago

        Even without being puritan, there are just different levels of intimacy we are willing to share with different social circles - which might be different for everyone. It’s fundamental to our happiness (in my opinion) to be able to decide for ourselves what we share with whom.

          • @gapbetweenus@feddit.de · 4 · 10 months ago

            You might not, but others do. People have rather different thresholds for what they consider intimate. I recommend just listening to interviews with victims; it becomes clear that to them the whole thing is very intimate and disturbing.

            • @MagicShel@programming.dev · 5 · 10 months ago

              And I said their feelings are valid and should be respected regardless of how I might feel about them. I’m not sure if you are looking for something more from me here. Despite my personal feelings that nudity shouldn’t be a source of shame, the fact is that allowing nudity to be used to hurt folks on the premise that nudity is shameful is something I utterly oppose. Like, I don’t think you should be ashamed if someone has a picture of you naked, but the real enemy is the person saying, “haha! I have pictures of you naked!!!” Whether the pictures are AI, or photoshopped, or painted on a canvas, or even real photographs.

    • @gapbetweenus@feddit.de · 7 · 10 months ago

      So you would not mind if I sent AI sex videos of you to your parents and friends? How about a video where you are sexually degraded playing in a public space - how would you feel about that? Maybe a video of you performing sexual acts that you find gross yourself? You just need a bit of empathy to understand that not everyone is into exhibitionism and wants intimate things to become public.

      • @Serinus@lemmy.world · 10 · 10 months ago

        I’d really prefer that people not send my parents any kind of porn.

        I look at it like someone took my face out of a Facebook picture, printed it, cut it out, pasted it over some porn, and did the same thing.

        It’d be a weird thing for them to do, but I don’t really need to send the law after them for it. Maybe for harassment?

        Laws have a cost, even well-intentioned laws. I don’t believe we need new ones for this.

        • @gapbetweenus@feddit.de · 1 · 10 months ago

          Do you think people might change their opinion on you and act differently after seeing you performing in porn?

          Laws have a cost, even well-intentioned laws.

          It causes distress to victims, arguably violates personal rights, and is morally and ethically questionable at the very least. What would be the downsides of criminal prosecution for non-consensual sexual deepfakes?

          • @Serinus@lemmy.world · 4 · 10 months ago

            If they understand that this kind of porn exists? No.

            But that’s an education thing, not a legal thing.

            The downside is giving law enforcement yet another excuse to violate digital privacy. Laws that are difficult/impossible to enforce tend to do more harm than good.

            I don’t see this law removing any fake Taylor Swift porn from the Internet. Or really any other celebrity, for that matter.

            • @gapbetweenus@feddit.de · -3 · 10 months ago (edited)

              If they understand that this kind of porn exists? No.

              You know people form opinions of actors based on their roles in movies? So people will change what they think of you and how they act toward you based on media, even if it’s clearly fictional.

              The downside is giving law enforcement yet another excuse to violate digital privacy. Laws that are difficult/impossible to enforce tend to do more harm than good.

              How exactly? What new power to violate digital privacy does this bill give the state?

          • Montagge · 4 · 10 months ago

            Yeah, but it’s happening to women mostly so these commenters probably don’t really care.

            • @gapbetweenus@feddit.de · 2 · 10 months ago

              I think a lot of men unfortunately have difficulty empathizing with women here, because they have a rather different experience when it comes to expressing their sexuality and the possible negative consequences.

      • Zellith · 5 · 10 months ago

        “So you would not mind if I send AI sex videos of you to your parents and friends?”. Seems like sending it would be the dick move. My family and friends probably have no interest in seeing deepfakes of me naked.

        “How about a video where you are sexually degraded playing in public space - how would you feel about that?” Considering it’s not really me… meh. I don’t personally care. Because it’s not me.

        “Maybe you performing sexual acts that you find gross yourself?” If someone wants to make deepfakes of me eating poop or something for whatever reason… oh well? It’s not really me.

        But you do you.

        • @gapbetweenus@feddit.de · -2 · 10 months ago

          My family and friends probably have no interest in seeing deepfakes of me naked.

          It mostly won’t say that it’s a deepfake of you. It’s just a WhatsApp message from someone who does not like you, and you have to explain a whole new technology to your parents.

          Considering its not really me… meh. I don’t personally care. Because it’s not me.

          You know it; others don’t. This will greatly change others’ perception of you and how they treat you.

          It’s not really me.

          Your boss and coworkers don’t know.

          But you do you.

          No, but I have empathy for other people.

    • @shiroininja@lemmy.world · 0 · 10 months ago

      Not at all. Think of the consequences if someone’s nudes were leaked, or an OnlyFans account was made with images of them, and an employer saw it. They’re already firing teachers for being on there. And a lot of times they’re used in extortion. Not to mention your image is your property. It is you. And nobody else has rights to it.

        • @shiroininja@lemmy.world · 2 · 10 months ago

          You don’t have to take nudes anymore to have nudes leaked. There are Ai that strip clothes from pictures. People have been making csam off of pictures of peoples kids on their Instagram profiles,etc.