There have been a ton of CSAM and CP arrests in the US lately, especially of cops and teachers, along with at least one female teacher seducing boys as young as 12. I cannot understand the attraction to kids. Even teens. Do these people think they are having a relationship, or that it is somehow okay to take away another human being's innocence? Is it about sex, power, or WTH is it? Should AI-generated CSAM and CP be treated the same as material involving a real person, since it promotes the same issues? I am a grandfather, and I am worried about how far this will go now that the new AI can put anyone's face into a porno movie too.

It seems to me that a whole new set of worldwide guidelines and laws needs to be put into effect ASAP.

How difficult would it be for AI photo apps to filter out words in prompts, so someone cannot make anyone appear naked?
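For a sense of why this is harder than it sounds, here is a minimal sketch of the kind of prompt blocklist an app might start with (the term list and function name are hypothetical, purely for illustration). Building the filter is trivial; the comments show how trivially it is evaded:

```python
# Minimal sketch of a prompt blocklist, the first-line defense an image
# app might use. Term list and function name are hypothetical.
import re

BLOCKED_TERMS = {"naked", "nude", "undressed"}  # illustrative, not exhaustive

def is_prompt_allowed(prompt: str) -> bool:
    """Reject the prompt if any blocked term appears as a whole word."""
    words = set(re.findall(r"[a-z]+", prompt.lower()))
    return not (BLOCKED_TERMS & words)

print(is_prompt_allowed("a naked elf"))       # False: caught by the list
print(is_prompt_allowed("an unclothed elf"))  # True: synonym slips through
print(is_prompt_allowed("a n@ked elf"))       # True: trivial obfuscation wins
```

Because word filters are so easy to route around, services that take this seriously typically also run a safety classifier on the generated image itself, and even those classifiers are imperfect.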

  • @Candelestine@lemmy.world · 9 points · 1 year ago

    I don’t know many of the answers to these questions; I’m no social scientist or doctor. On the tech side of things, this is all very new, and we are still coming to grips with it.

    I feel pretty comfortable fielding this one, though: there should be no exceptions granted for AI-generated pornographic content, and a person’s facsimile should have the same protections as actual photos of them. I do not think many among the general public will find this controversial, as it would pretty clearly serve to protect all of us from having our identities used against our will.

    I expect that even our Congress should be able to get at least something on the books in this direction. Eventually. Maybe. Or at least the FCC or something.

    • @Delphia@lemmy.world · 22 points · 1 year ago

      Let me preface this by saying I DO NOT SUPPORT CSAM!

      The only issue I take with AI-generated images is that there’s no true “age” to the picture, and any legislation that would allow people to be jailed or charged would have to be worded very carefully.

      “Depicting clearly underage subject matter if it were a person, or using prompts to generate someone who clearly appears underage” needs careful wording, simply because someone could be marked for life for typing in “naked elf,” having the program spit out something with small boobs and childlike features, and not shredding their HD immediately.

        • @Delphia@lemmy.world · 15 points · 1 year ago

          That’s exactly my point. Sure, the courts may rule in your favor eventually, but you just got marched out of work in handcuffs for possession of CSAM, your entire personal and professional circle knows, and any explanation you offer is going to sound like total bullshit.

          “It was an AI generated image, and it was an elf! She just looked young, but not like illegal young! Guys you have to believe me!”

          • @CeruleanRuin · -3 points · 1 year ago

            As if I needed another reason to never watch anime.

        • @CeruleanRuin · 3 points · edited · 1 year ago

          “Sir, I’m detecting a strong odor of weeb coming from this vehicle.”

      • FuglyDuck · -2 points · 1 year ago

        “Depicting clearly underage subject matter if it were a person, or using prompts to generate someone who clearly appears underage” needs careful wording, simply because someone could be marked for life for typing in “naked elf,” having the program spit out something with small boobs and childlike features, and not shredding their HD immediately.

        Has that ever happened, though? I don’t think it happens as much as people imagine it does. This is an issue with any CP, not just AI-generated stuff.

          • FuglyDuck · 2 points · 1 year ago

            So… 1. They were obviously childlike.

            However, while the cartoon characters were elves and pixies, they were also clearly young elves and pixies, which led to concerns the images were linked to child sexual abuse.

            2. He didn’t just view it; he downloaded it and kept it in a spank bank for 3 years.

            Ronald Clark downloaded the Japanese anime cartoons three years ago, setting in train events that would see him in court in Auckland and jailed for three months for possessing objectionable material

            3. He had prior convictions for sexually assaulting a 12-year-old boy, and I’m guessing there are parole conditions against possessing CP material.

            Clark has previous convictions for indecently assaulting a teenage boy and has been through rehabilitation programmes, but the video nasties he was watching in this case were all cartoons and drawings.

            4. It was for the artistic merit! Uh huh. I watch porn for the story too! /s

            Clark admitted he was interested in the images but he said it was for their artistic merit and as “a bit of a laugh”. He did not find them sexually arousing, he said.

            I think it’s safe to say he didn’t get convicted for simply viewing a search result. So I stand by what I said: it doesn’t happen as often as people think it does. Even if you come down on one side… that’s one instance in the whole world.

      • @Candelestine@lemmy.world · -6 points · 1 year ago

        It’s not Congress’s job to make perfect laws; that would be even less efficient than the system we have now. The interpretation and “fine-tuning” of law is the job of the courts: it is handled literally case by case, in the hands of judges and literal armies of lawyers, every year.

        • @Delphia@lemmy.world · 13 points · 1 year ago

          Ok, so when someone gets arrested for possession of CSAM because someone decided those pixels looked a little young, their entire life comes crashing down around their ears; the arrest makes the papers, and it’s forever out there. I’m sure they won’t mind at all, because after financially ruining themselves to beat the charges in court, they were found innocent.

          Or

          The law gets written well by people that we the people pay to do their job and write the laws.

          • @Candelestine@lemmy.world · 5 points · 1 year ago

            Except everyone handwaves away the whole “written well” or “carefully written” part, because there is no actually good place to draw that line. Which is why neither you nor anyone else can think of one. It will always ultimately be up to some judge’s or jury’s subjective interpretation, regardless of what line you gave them to work with.

            So they’re just going to ban it all. They do not consider digital art collections to be of very great importance; they’re generally more interested in things that involve lots and lots of money. Or things that can be used to stoke fear, so they can come in, get a legislative notch on their belt, and “save the day” with some half-cocked fix.

            • @CeruleanRuin · 1 point · 1 year ago

              Just to get it straight, by “digital art collections” are you talking about that folder of anime girls on your computer?

              • @Candelestine@lemmy.world · 1 point · 1 year ago

                No, when I say they don’t care about digital art collections, I really mean all digital art. They won’t ban all digital art, of course; they’re just going to make no exceptions for AI-generated artwork when considering what is or is not banned. Similarly, using someone’s facsimile, which is not currently illegal AFAIK, should be.

    • FuglyDuck · 0 points · 1 year ago

      Even entirely fake/generated characters.

      CSAM is one of the tools groomers use to groom kids. “See this is normal. Here look at this. Doesn’t that look like fun?”

      There are no easy answers here. The issue is complex and extremely difficult, and I don’t think anyone has it figured out.