• @Gigan@lemmy.world · 63 points · 8 months ago

    If they’re going to go this way, I don’t think it should be limited to just porn. There are plenty of ways you could ruin someone’s life without a deepfake being sexually explicit.

    • @Deestan@lemmy.world · 17 points · 8 months ago

      There already are a lot of laws covering that. This one covers an additional angle: people who create a deepfake without provably publishing it. The intent is that showing it to friends and verbally threatening to “leak” it should be easier to prosecute.

      If you create a deepfake and share it, you’re slapped with two crimes.

      • FaceDeer · 10 points · 8 months ago

        the intent being that showing it to friends and verbally threatening to “leak” it should be easier to prosecute.

        That’s blackmail, which is already illegal.

        • @nogooduser@lemmy.world · 7 points · 8 months ago

          Using a mobile phone while driving has always been illegal if you could argue that it was dangerous driving or driving without due care and attention. They made a law specifically saying that using a mobile phone without hands free is illegal anyway. This makes it easier to prosecute because you don’t need to argue that they were driving dangerously or without due care.

          I imagine that this law has the same intent: making this specific act illegal so prosecutors don’t have to argue that it fits another crime.

    • @billbasher@lemmy.world · 8 points · 8 months ago

      Yeah, the way people can recreate someone “in need of assistance” to trick family or associates is really scary, especially for people who aren’t exactly tech savvy. That seems to me a worse crime than an explicit video that is pretty obviously doctored.

  • @usualsuspect191@lemmy.ca · 20 points · 8 months ago

    I wonder what happens when it just accidentally looks like someone but was intended to be a fictional person. Also, how much can you base it on a real person before it’s considered a deep fake of that person? Would race-swapping be enough to make it a “new” person so it’s not illegal anymore? My intuition is that just eye colour or something wouldn’t be enough, but it’s a sliding scale where the line must be drawn somewhere even if it’s a fuzzy line.

    What about an AI generated mashup of two people like those “what the child would look like” pictures back in the day. Does that violate both people or neither?

    What about depicting a person older than they are now? That’s technically not somebody that exists, but might in the future.

    What if you use AI but make it look like it’s hand-drawn or a cartoon?

    What if you use AI to create sexual voice clips of a real person but use images that don’t look like them or no image at all?

    There are just so many possibilities and questions that I feel it might be impossible to legislate in a way that isn’t always 10 steps behind or has a million unforeseen consequences.

    • @Not_mikey@slrpnk.net · 5 points · 8 months ago

      There are already laws against using someone’s likeness for commercial purposes without their consent. I’m guessing this will require the same fuzzy cutoff and basically be up to the jury to decide or the judge to dismiss.

    • @WamGams@lemmy.ca · -7 points · 8 months ago

      Well, let’s find out. Please give me 20 sample photos of you, 30 minutes of audio and 10 of video.

      I’m going to have you get gangbanged by 100 German men and upload it to xvideos.

      Now, that is probably something you deserve to consent to, isn’t it?

  • @cygnus@lemmy.ca · 11 points · 8 months ago

    I have a hard time accepting this as a crime. What if the illustration is hand-drawn, or clothed but still sexual in character? Is caricature illegal, by this standard?

      • @MareOfNights@discuss.tchncs.de · 6 points · 8 months ago (edited)

        Yea, this is a funny thing to think about.

        You can jerk off to photos of people, you can imagine some wild things involving other people etc.

        If you just create some deepfake porn to masturbate by yourself to, I don’t see a big problem with that.

        The only issue I can see is that, through negligence, someone else sees it or even hears about it.

        The problem starts with sharing. It would be sexual harassment to tell people who you are masturbating to, and especially to share it with the “actors” of your fantasies.

        There is however another way this could go:

        Everyone can now share nudes with way less risk.

        If anyone leaks them, just go: “That’s a deepfake. This is obviously a targeted attack by the heathens of the internet, who are jealous of my pure and upstanding nature. For me only married missionary with lights out.”

            • @WamGams@lemmy.ca · 5 points · 8 months ago

              So if I use AI to make pornography of 50 men gangbanging you, you will consider that to be on the same level as going to a carnival and getting a caricature done?

              • @CanadaPlus@futurology.today · 1 point · 2 months ago (edited)

                Huh, you must have replied somewhat late to this - I’m sure I checked back here for any replies before I returned to my main instance for good.

                Actually, yes. If you sent it to me, that would be sexual harassment (just like if you sent me an unsolicited text description of what you want to do to me), but I don’t care what you do in private.

      • @capem@startrek.website · -8 points · 8 months ago (edited)

        Ooohh, can’t wait to see us waste billions of dollars deliberating what is acceptable just like with copyright law.

        This is another law that only exists to protect rich people. Poor people can’t afford a lawyer and don’t have time to show up in court.

        • @andrewta@lemmy.world · 4 points · 8 months ago

          You seriously can’t see why deepfakes are a serious problem for everyone?

          This law won’t protect just the rich.

          Imagine the chaos as some idiot teen creates a deep fake of some other teen in a compromising position.

          Go talk to an attorney and see what they have to say about it.

    • @Deestan@lemmy.world · 10 points · 8 months ago

      Is caricature illegal, by this standard?

      No.

      The official government announcement is linked in the article btw.

  • @gedaliyah@lemmy.world · 7 points · 8 months ago (edited)

    This is why we should be making laws around likeness rights. If you damage somebody by publicly using their name to spread falsehoods, that’s defamation or libel. But if you produce an image or video of their likeness instead of using their name, there’s no legal recourse. That makes no sense in this day and age.

    • @cygnus@lemmy.ca · -2 points · 8 months ago

      Who decides how similar somebody is “allowed” to look to another? There are people who bear an uncanny resemblance to others. And what of identical twins? Can one sue the other if they do porn?

  • Flying Squid (M) · 6 points · 8 months ago

    “Without consent.” I’m very curious who would consent to having deepfake porn made of themselves.

    • BraveSirZaphod · 12 points · 8 months ago

      I can imagine a non-zero amount of people would consent to a deep-fake porn video of themselves having sex with some generic hot woman, just as one example.

      • Flying Squid (M) · 2 points · 8 months ago

        That makes sense, I hadn’t thought of that sort of deepfake.

      • @dan1101@lemm.ee · -1 point · 8 months ago

        Better make sure the generic hot woman doesn’t resemble anyone real though.

    • @Not_mikey@slrpnk.net · 8 points · 8 months ago

      Could be very lucrative if you are already in porn and want to make some money from your likeness. This guy’s gonna pay me $500 to make a video and I don’t even have to do anything?

      Could also be very good for porn stars who have “aged out” but can still make videos using their younger bodies as weird as that may be.

    • @Grimy@lemmy.world · 5 points · 8 months ago (edited)

      A user shared a story a while back about his wife and her sister giving photos and agreeing to it. Lots of kinky people out there.

  • theodewere · 5 points · 8 months ago

    seems like the only way to deal with it… make it equivalent to sexual assault…

    • FaceDeer · 1 point · 8 months ago

      A naked picture of me simply existing is not equivalent to sexual assault. If you want to make it illegal then treat it as its own thing.

    • @capem@startrek.website · -3 points · 8 months ago

      The only way to deal with it is to let so much of it flood the digital world that nobody cares anymore because there’s a deepfake porno of everyone.

      This is a waste of money to ensure rich people don’t get porn made of them by poor people.

      Poor people won’t be able to afford lawyers and aren’t able to take time off to show up in court.

  • @tal@lemmy.today · 0 points · 8 months ago

    “Deepfake pornography is a growing cause of gender-based harassment online and is increasingly used to target, silence and intimidate women — both on and offline,” Meta Oversight Board Co-Chair Helle Thorning-Schmidt said in a statement.

    considers

    I think that there’s an argument for taking the opposite position. If someone could make deepfake porn trivially and it were just all over the place, nobody would care about it; one knows that it’s fake.

    In fact, it’d kind of make leaked actual pornography no-impact as a side effect, unless there were a way to distinguish deepfakes from real footage. And that’s a harder issue to resolve. I was reading a discussion yesterday about sextortion on here and how technically difficult it would be to keep someone from recording sex video chats; there’d always be an analog hole at least. But there is another route to solve that, which is simply to make such a video valueless because there’s a flood of generated video.

    • @intrepid@lemmy.ca · 2 points · 8 months ago

      Deepfake recognition is already available. And while what you predict sounds logical, these criminals prey on emotions. I feel that a lot of innocent people will still be victimized even if deepfake porn becomes common.