Scarlett Johansson hits AI app with legal action for cloning her voice in an ad

An AI-generated version of Scarlett Johansson’s voice appeared in an online ad without her consent. Johansson is taking legal action against the AI app developer for using her likeness without her permission.

        • @SlapnutsGT@lemmy.world · 1 year ago

          I do not, and good luck finding it. Searches turn up nothing but articles or video news broadcasts about the ad; I haven’t been able to find the actual ad itself.

            • eric · 1 year ago

              There’s no way they’d have a case unless the voice impersonated her, so I’d be shocked if the commercial didn’t present it as her, but I couldn’t find a copy of it online either. I did hear the fake Tom Hanks ad a while back, and it definitely claimed to be him, so this sort of thing has happened before. Also, there would be no point in using her voice unless the audience thinks it’s her.

                • eric · 1 year ago

                  Sure, in that hypothetical there would be no case, but that would make her legal team absolute idiots for thinking a case built on simple mimicry was winnable. And there is absolutely no reason to think they are idiots, since Johansson doesn’t have a history of frivolous lawsuits or losing her legal battles. She went up against the hackers who stole her nudes as well as Disney and won both cases, so I’m going to give her legal team the benefit of the doubt here.

    • Dran · 1 year ago

      Fwiw I disagreed with you but upvoted for making a reasoned argument. We do need to drop that Reddit mentality of downvoting whatever you disagree with. IMHO you should downvote things that are either demonstrably false or low-effort.

      That said, I think voice and image impersonation would each individually fit the bill for “intent to deceive.” I’d be surprised if there weren’t already a lot of legal precedent for this in the realm of advertising.

      https://casetext.com/case/waits-v-frito-lay-inc

      The Tom Waits case is the only one I’m aware of off the top of my head, but the TL;DR is that Frito-Lay tried to license a song of his for an ad, he refused, so they hired an impersonator to sing in his style instead. He sued Frito-Lay and won.