‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity

It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

  • @Imgonnatrythis@sh.itjust.works
    39 points · 1 year ago

    I use an ad blocker and haven’t seen these. Perhaps a link to the best ones could be shared here for better understanding of what the article is talking about?

    • @jivandabeast@lemmy.browntown.dev
      9 points · 1 year ago

      Sus question lmfao

      These things have been around since the onset of deepfakes, and if you take a couple of seconds to look, you’ll find them. It’s a massive issue, and the content is everywhere.

            • @jivandabeast@lemmy.browntown.dev
              -3 points · 1 year ago

              No, I disagree, because before you could tell a fake from a mile away. Deepfakes bring it to a whole new level of creepy because they can be EXTREMELY convincing.

                • @Delta_V@midwest.social
                  2 points · 1 year ago

                  Or maybe an accessibility improvement. You don’t need to practice creating your own works of art over many years anymore, or have enough money to commission a master artist. The AI artists are good enough and work for cheap.

                • @barsoap@lemm.ee
                  1 point · 1 year ago

                  The difference is that we can now do video. I mean, in principle that was possible before, but it was also a hell of a lot of work. Making it look real hasn’t been a problem since before Photoshop; if anything, people get sloppy with AI, partly because what feels like 99% of people who use AI don’t have an artistic bone in their body.

                • @jivandabeast@lemmy.browntown.dev
                  1 point · 1 year ago

                  I’m not saying that it’s a shift in nature? All I’ve been saying is:

                  A) tools to create realistic nudes have been publicly available ever since deepfakes became a thing

                  B) deepfakes are worse than traditional photoshopped nudes because (as you put it, a quality improvement) they’re more convincing and can therefore have more detrimental effects

              • @lolcatnip@reddthat.com
                4 points · 1 year ago

                There was only a brief period, between the invention of photography and now, when that was true. For thousands of years before that, it was possible to create a visual representation of anything you imagined without any hint that it wasn’t something real. Makes me wonder if there were similar controversies about drawings or paintings.

        • @NeoNachtwaechter@lemmy.world
          -1 point · 1 year ago

          And you are sure that ‘someone’ is of legal age, of course. Not blaming you. But does everybody always know that ‘someone’ is of legal age? Just an example to get you thinking.

            • @andros_rex@lemmy.world
              1 point · 1 year ago

              Depends on where you live. Not legal in the UK, for example. In the US it can even be broken down at the state level, although there’s lots of debate on whether states are able to enforce their laws. “Obscene” speech is not protected under free speech; the argument would be whether or not the naked drawings had artistic merit.

              I’m not a lawyer, but I do know that people in the US have gone to prison for possessing naked images of fictional children and it’s on the books as illegal in many other countries.

        • @Tyfud@lemmy.one
          -1 point · 1 year ago

          It’s what the courts think that matters, and right now it’s not clear what the enforceable laws here are. There’s a very real chance people who do this will end up in jail.

          I believe prosecutors are already filing cases about this. The next year will decide the fate of these AI-generated deepfakes and the people behind them.