There have been a lot of CSAM and CP arrests in the US lately, especially of cops and teachers, along with at least one female teacher seducing boys as young as 12. I cannot understand the attraction to kids, even teens. Do these people think they are having a relationship, or that it is somehow okay to take away another human being's innocence? Is it about sex, power, or WTH is it? Should AI-generated CSAM and CP be treated the same as material depicting a real person, since it promotes the same issues? I am a grandfather, and I am worried about how far this will go now that the new AI can put anyone's face into a porn movie too.

It seems to me that a whole new set of worldwide guidelines and laws need to be put into effect asap.

How difficult would it be for AI photo apps to filter out words, so someone cannot make anyone naked?

  • @Deestan@lemmy.world · 9 months ago

    To answer some of the questions:

    > I cannot understand the attraction to kids.

    There was a TV interview with people who were seeking help for pedophilia. They described it as just plain horny sexual attraction that they knew they had to not act on. I guess people have different reasons, and some probably manage to rationalize it as “relationships” as you say.

    > Should AI generated CSAM and CP be treated the same as a real person since it promotes the same issues?

    Whether it is a modified image of a real person or a purely generated picture, it falls under the same laws as any other depiction, which is already uncontroversially illegal.

    > How difficult would it be for AI photo apps to filter out words, so someone cannot make anyone naked?

    Hard, as there are many ways to describe nudity or nudge the generator toward it without using any obvious word: "person with visible thighs, no skirt" and such.
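
    To see why, here is a minimal sketch of what a naive prompt blocklist looks like. The word list and function name are made up for illustration; the point is that exact-word matching misses any phrasing that describes the same thing indirectly, as in the example above.

    ```python
    # Hypothetical, tiny blocklist for illustration only -- real filters
    # are larger, but face the same fundamental problem.
    BLOCKED_WORDS = {"naked", "nude", "nudity"}

    def is_allowed(prompt: str) -> bool:
        """Allow a prompt only if no blocked word appears in it."""
        words = (w.strip(".,!?") for w in prompt.lower().split())
        return not any(w in BLOCKED_WORDS for w in words)

    print(is_allowed("a nude figure"))                          # False: caught
    print(is_allowed("person with visible thighs, no skirt"))   # True: slips through
    ```

    The second prompt never uses a blocked word, so a word filter passes it even though it steers the generator the same way.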

    Easier to leave nudity out of the training data, which is already common.

    Then hard again, because anyone can throw together a new image generator trained on whatever they want, with no word filters at all.