There have been a ton of CSAM and CP arrests in the US lately, especially of cops and teachers, along with at least one female teacher seducing boys as young as 12. I cannot understand the attraction to kids, or even teens. Do these people think they are having a relationship, or that it is somehow okay to take away another human being's innocence? Is it about sex, power, or WTH is it? Should AI-generated CSAM and CP be treated the same as material depicting a real person, since it promotes the same issues? I am a grandfather, and I am worried about how far this will go now that the new AI can put anyone's face into a porno movie too.

It seems to me that a whole new set of worldwide guidelines and laws needs to be put into effect ASAP.

How difficult would it be for AI photo apps to filter out words in prompts, so that no one can generate a naked image of someone else?
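For what it's worth, the simplest version of this, a keyword blocklist on prompts, is trivial to write but also trivial to evade with misspellings or paraphrases, which is why real moderation systems rely on trained classifiers rather than word lists. A minimal Python sketch (the word list and function name here are purely illustrative, not any app's actual filter):

```python
import re

# Hypothetical blocklist for illustration; a real filter would be far
# larger and would still miss misspellings like "n4ked" or euphemisms.
BLOCKLIST = {"naked", "nude", "undress"}

def is_blocked(prompt: str) -> bool:
    # Lowercase the prompt, split it into alphabetic tokens,
    # and flag it if any token appears in the blocklist.
    tokens = re.findall(r"[a-z]+", prompt.lower())
    return any(token in BLOCKLIST for token in tokens)

print(is_blocked("make her naked"))   # True
print(is_blocked("a sunny beach"))    # False
```

The ease of writing this is exactly the problem: it only blocks the words you thought of, so filtering alone cannot stop determined misuse.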

  • @cooopsspace@infosec.pub · 9 months ago

    At the end of the day, art is just pixels on a flat surface. Determining whether a depicted individual is underage where it's not obvious sets a dangerous precedent. Is the person in the picture 17 or 18? Who knows.

    But the problem is that people have been sexualising people like Emma Watson since she first appeared on screen. That's not okay, and rather than sending AI art underground, I think society needs to change: normalise education about sex, reproduction and genitalia, and address the social issues so that pedophilia is treated like the disease that it is.

    Meanwhile, pedophiles' names are being published, risking mob violence and further isolation. On top of that, in the US there's a lot of negative attention being put on women's reproduction, children's sex ed and genitalia, and a push to make the whole lot illegal and taboo. Not to mention people teaching their kids pet names for their parts: "Uncle Ben touched my heehaw" sounds a lot different to "Uncle Ben touched my penis".

    Society is the problem; the US in particular is going in the wrong direction on many aspects of sex education.