Instagram is profiting from several ads that invite people to create nonconsensual nude images with AI image generation apps, once again showing that some of the most harmful applications of AI tools are not hidden in the dark corners of the internet, but are actively promoted to users by social media companies unable or unwilling to enforce their policies about who can buy ads on their platforms.

While parent company Meta’s Ad Library, which archives ads on its platforms along with who paid for them and where and when they were posted, shows that the company has taken down several of these ads before, many ads that explicitly invited users to create nudes, and some of the accounts buying them, remained live until I reached out to Meta for comment. Some of these ads were for the best-known nonconsensual “undress” or “nudify” services on the internet.

    • @BreakDecks@lemmy.ml · 45 points · 7 months ago

      This was from a test I did with a throwaway account on IG where I followed a handful of weirdo parents who run “model” accounts for their kids to see if Instagram would start pushing problematic content as a result (spoiler: yes they will).

      It took about 5 minutes from creating the account to end up with nothing but dressed-down kids on my recommendations page, paired with inappropriate ads. I guess the people who follow kids on IG also like these recommended photos, so the algorithm figures they must be perverts, but it doesn’t care about the sickening juxtaposition of children in swimsuits next to AI nudifying apps.

      Don’t use Meta products. They don’t care about ethics, just profits.

      • Karyoplasma · 22 points · 7 months ago (edited)

        The other day, I saw an ad on Facebook that was basically lolicon. It depicted a clearly underage anime girl in a sexually suggestive position on a motorcycle, with her panties almost off. I am in Germany, and Facebook knows I am in Germany; if I took a screenshot of that ad and saved it, it would probably be classed as CSAM in my jurisdiction. I reported the ad and was informed a few days later that FB found “nothing wrong” with it. Fuck off, you child predators.

        • @BreakDecks@lemmy.ml · 9 points · 7 months ago

          I logged into my throwaway account today just to check in on it since people are talking about this shit more. I was immediately greeted with an ad featuring hardcore pornography, among the pics of kids that still populate my feed.

          I’ll spare you the screenshot, but IG is fucked.

      • @LucidBoi@lemmy.dbzer0.com · 11 points · 7 months ago

        Coupled with the article about pedos blackmailing kids with their fake nudes to get real ones, this makes my stomach turn and eyes water. So much evil in this world. I am happy to say I deleted my FB and IG accounts a few days ago. WhatsApp is tough to leave due to family though… Slowly getting people to switch over to safer and more ethical alternatives.

        • @BreakDecks@lemmy.ml · 8 points · 7 months ago

          This 100%. I can’t even bring myself to buy new content for my Quest now that I’m aware of the issues (no matter how much I want the latest Beat Saber and Synth Riders DLC), especially since Meta’s Horizon, in my experience, puts adults into direct contact with children. At first I just dismissed metaverse games like VRChat or Horizon as being too popular with kids for me to enjoy, but now I realize they put me, an adult, straight into voice chats with tweens, which people should fucking know better than to allow. My first thought was to log off because I wasn’t having fun in a kid-dominated space, but I have no doubt that these apps are crawling with creeps who see that as a feature rather than a problem.

          We need to educate parents that sharing pictures of their kids online comes with real risks, as does giving kids free rein to use the Internet. The laissez-faire attitude many people have toward social media needs to be corrected, because real harm is already being done.

          Most of the parents who post untoward pics of their kids online are chasing modeling opportunities for their kids, and they’re ignoring the fact that a significant share of the engagement these photos receive comes from people objectifying children. There seems to be a pattern: the most revealing outfits get the most engagement, so future pictures are equally if not more revealing, to chase more engagement…

          Parents might not understand how disturbing these patterns are until they’ve already dumped thousands of pictures online, and at that point they’re likely to be in denial about what they’re exposing their kids to, and/or too invested to want to reverse course.

          We also need to have a larger conversation, as a society, about using kids as models at all. Pretty much every major manufacturer of children’s clothing hires real kids to model the clothes. I don’t think it’s necessary to publish that many pictures of kids online, nor is it acceptable to do so for profit. There’s no reason not to limit modeling to adults who can consent to putting their bodies on public display, and to use mannequins for kids’ clothing. The sheer volume of kids’ swimsuit and underwear pictures hosted on e-commerce sites is likely one reason generative AI models are capable of creating inappropriate images of children, not to mention the actual CSAM found in the LAION dataset most of these models are trained on.

          Sorry for the long rant; this shit pisses me off. I should consider sending 404 Media everything I know, since they’re doing investigations into this kind of thing. My small-scale investigation has revealed a lot to me, but more people need to get as upset as I am about it if we want to make the Internet less of a hellscape.