• Pooptimist
    97
    1 year ago

    Hear me out on this one:

    If we take it as given that pedophilia is a disorder and ultimately a sickness, wouldn’t it be better that these people get their fix from AI-created media than from the real thing?

    IMO no harm was done to any kid in the creation of this, and it would be better to give these people the fix they need, or at least desperately desire, in this way before they escalate to more desperate and harmful measures.

    • @DrPop@lemmy.one
      78
      1 year ago

      You have a point, but in at least one of these cases the images used were of girls around them, and the person even tried extorting one of them. Issues like this should be handled on a case-by-case basis.

    • @hoshikarakitaridia@sh.itjust.works
      52
      1 year ago

      Some of the comments here are so stupid: “either they unpedophile themselves or we just kill them for their thoughts”

      Ok so let me think this through. Sexual preferences of any kind are pretty normal and they don’t go away. Actually, if you try to ignore them they become stronger. Also, being a pedophile is not currently a crime; acting on it is. So what happens right now is that people bottle it up, then it gets too much and they act on it in gruesome ways, because “if I go to prison I might as well make sure it was worth it”. Kids get hurt.

      “But we could make thinking about it illegal!” No we can’t. Say that’s a law, what now? If you don’t like someone, they’re a “pedophile”. Yay, more false imprisonment. Also, what happens to real pedophiles? Well, they start committing more acts, because there’s punishment even for restraint. And the truth is a lot of ppl have pedophilic tendencies. You will not catch all of them. Things will just get worse.

      So why AI? Well, as the commenter above me already said, if there’s no victim, there’s no problem. While that doesn’t make extortion legal (I mean obv. it’s a different law), this could give ppl with those urges more restraint. We could even limit it to specific sites and make it non-shareable. We’d have more control over it.

      I know ppl still want the easy solution which evidently doesn’t work, but imo this is a perfect solution.

      • @foggenbooty@lemmy.world
        19
        1 year ago

        I largely agree with what you’re saying and there definitely is no easy solution. I’ve never understood why drawings or sex dolls depicting underage people are illegal in some places, as they are victimless crimes.

        The issue with AI generation that differentiates it a bit from the above is the fidelity. You can tell a doll or an anime isn’t real, but a few years from now it’ll be difficult to spot AI-generated images. This isn’t unique to this scenario though; it’s going to wreak havoc on politics, scams, etc., but there is the potential that real CP comes out of hiding and is somewhat shielded by the AI-generated material.

        Of course this is just speculation. I hope it goes the other way around and everyone just jacks off at their computers and CP disappears completely. We need to stop focusing our attention on people with pedophilia, get them mental health support, and focus on sex offenders who are actually hurting people.

        • Queen HawlSera
          3
          1 year ago

          I’m all for letting people have their dolls, drawings, and AI-generated stuff, but yeah… it would become easy for offenders to say “Naw, I snatched that shit off of DALL-E” and walk out of court, so some kind of forensic tool that can tell AI-generated images from real ones would have to be made…

          Actually, there are a lot of reasons we’d want a tool like that which have nothing to do with hypothetical solutions to kiddie diddling.

          Can you imagine how easy extortion would become if you could have an AI generate pictures of your next-door neighbor killing some rando missing person in the area? But every new technology enables crime until we find out what the proper safeguards are, so I’m not too worried about it in the long term.

          • @Anomalous_Llama@lemmy.world
            1
            1 year ago

            It’s also worth remembering that to make super accurate pictures of children the AI may have been trained on illegal photos (on purpose or not)

            But that’s an issue for the AI creator

      • Queen HawlSera
        10
        1 year ago

        I pretty much agree. While we should never treat pedophilia as “just another perfectly valid sexuality, let’s throw a parade, it’s nothing to be ashamed of” (having the urge to prey on children is ABSOLUTELY something to be ashamed of, even if you can’t control it), we need to face facts… It isn’t someone waking up one day and saying “Wouldn’t it be funny if I took little Billy out back and filled him full of cock?”

        It’s something going on in their head, something chemical, some misfiring of the neurons, just the way their endocrine system is built.

        As much as I’d love to wave a magic wand over these people I reluctantly call people and cure them of their desires, we don’t have the power to do that. No amount of therapy in the world can change someone’s sexual tastes.

        So in lieu of an ideal solution, finding ways to prevent pedophiles from seeking victims in the first place is the next best thing.

        It’s not dissimilar to how, when we set up centers where drug-addicted people can get small doses of what they’re addicted to so that they can fight withdrawal symptoms, crime and death rates go down. When you enact things like universal basic income and SNAP, people have less of a reason to rob banks and gas stations, so we see fewer of those crimes.

        It’s not enough to punish people who do something wrong, we need to find out why they’re doing it and eliminate the underlying cause.

      • @Skwerls@discuss.tchncs.de
        1
        1 year ago

        There’s also a difference (not sure if clinically) between people who sexualize really young kids and people who are attracted to those just under whatever age society has decided splits children from adults. In the USA, porn depicting the latter is fine as long as everyone involved is over the age of adulthood, even if they dress up to look younger.

        I think people who refer to pedophilia are usually referring to the former, not the 30-year-old dating a 17-year-old or whatever. But the latter makes it a little weird. Images of fictional people don’t have ages. Can you charge anyone who has AI-generated porn with CSAM if the people depicted sort of look underage?

        AI-generated content is going to bring a lot of questions like these that we’re going to have to grapple with as a society.

        • @hoshikarakitaridia@sh.itjust.works
          1
          1 year ago

          The first part of your comment is rather confusing to me, but I fully agree with the latter part. Judging age from appearance is something that will haunt us even more with AI until we find new solutions. But that is going to be one of a list of big questions to be asked in conjunction with new AI laws.

      • Ataraxia
        -2
        1 year ago

        Pedophilia isn’t a sexual preference any more than cannibalism is a dietary one…

        • @hoshikarakitaridia@sh.itjust.works
          2
          1 year ago

          You know what? Sure. Imagine I find ppl really tasty, especially hands. But I never chew on one. I just think about it. Literally the same thing. You should be rewarded for restraint of these urges. If I’d get punished just for thinking about munching on a thumb, I’d at least take a hand with me to jail. I’m going there anyway.

    • Queen HawlSera
      32
      1 year ago

      That’s basically how I feel. I’d much rather these kinds of people jack it to drawings and AI-generated images if the alternative is that they’re going to go after real children.

      • @Black_Gulaman@lemmy.dbzer0.com
        -48
        1 year ago

        At some point the fake images won’t do it for them anymore, and then they’ll turn their attention to real kids. We don’t want to wait for that to happen.

        It’s like using a drug where your tolerance increases with each use; there will come a time when your old limit has no effect on your satisfaction level.

          • @MBM
            0
            1 year ago

            None of us are specialists here, so people saying it is harmless and people saying it isn’t are both speculating.

          • @datavoid@lemmy.ml
            -2
            1 year ago

            Seems like speculation, but personally I’d be amazed if it were completely incorrect.

            If people who are attracted to children are constantly looking at CP, they are inevitably going to become more comfortable with it. Same with any other type of porn - do you think people who watch tons of torture porn don’t become increasingly desensitized to it? It’s also the same for any other illegal or shocking content. I spent enough time on 4chan 10+ years ago to vouch for this personally.

            I’m not saying that everyone who looks at these AI images will act on their desire, but some people will absolutely end up wanting more after having open access to pictures of naked children.

            Honestly it’s a bit concerning how people are voting this down. Why do we value the sexual gratification of pedos over the potential safety of children?

            • @foggenbooty@lemmy.world
              21
              1 year ago

              This is the War on Drugs argument all over again, except using porn instead of marijuana as a “gateway”.

              You’re correct that there can be some crossover and some unstable people could have an addiction that gets out of control, but I don’t think there’s any proof that happens in high enough numbers.

        • @papertowels@lemmy.one
          28
          1 year ago

          By your logic, does everyone who’s into BDSM have a sex dungeon in their bedroom?

          Your comment reduces everyone to their base fetishes, as if that were the only thing exerting pressure on an individual to act, and I don’t believe that’s the case.

          • Queen HawlSera
            8
            1 year ago

            I’ll come right out and say it, I’m into inflation.

            The number of times I’ve gone out, bought a helium tank, and shoved a tube up anyone’s ass is just about equal to the number of times I’ve been the Republican candidate for the US presidency… and I’m not even 35 yet.

            I think we all have weird kinks, it’s a part of the human experience.

            Heck imagine if we thought this way for EVERY sexual desire someone had.

            “Porn for people who prefer blondes? I dunno, what if they get carried away and start dyeing random brown-haired people’s hair? The consequences are too great!”

            Sounds fucking ridiculous when you think of it that way.

        • Queen HawlSera
          12
          1 year ago

          Do you know how much porn there is of the My Little Pony characters? Tons

          Do you know how much of an epidemic there is of cartoon watchers going out and fucking ponies? Somewhere between null and zilch… Maybe one or two extreme cases, but that’s around the same number of people who watch superhero movies and try to jump off the roof in order to fly.

          This is a slippery slope fallacy if I ever saw it.

          Heck, if anything we’ve seen that restrictions on porn actually lead to increased instances of sexual assault, in the same way a crackdown on drugs just leads to more deaths from overdoses.

          If letting some sicko have fake images of pretend children saves even one real child from being viciously exploited, I think it’s worth it.

          It’s not ideal and yeah, it makes the skin of any sane person crawl… Ideally we should be out curing pedophiles of their sexual urges entirely, but since we don’t have a way to do that, why let perfect be the enemy of good? I mean, what other ideas do we have? Cause “To Catch A Predator” may have been good television, but even that had ethical concerns ending in lost lawsuits and suicides, and castrating everyone convicted isn’t exactly 8th Amendment-friendly… and even then, that prevents repeat offenses, not initial offenses. (Prevention > Cure)

          Now all this aside, we do need to look at this on a case by case basis. If real children are being used to model for the AI or fake images are used as a form of blackmail (Think “Revenge Porn”, but way, way worse), then cuffs need to be slapped on people.

      • Queen HawlSera
        4
        1 year ago

        And when it does, we won’t be able to do anything about it: “While making sure this never happens again is a noble goal, let’s not politicize this tragedy.”

        Or as they say over in Europe, “Apparently the Americans say there’s no way to prevent that problem that literally doesn’t happen anywhere else in the world.”

      • @mindbleach@sh.itjust.works
        -1
        1 year ago

        My guy. We can be pretty confident that expensive training using human-labeled data did not include child pornography. Nobody just slipped in the sort of images that are illegal to even look at.

    • snownyte
      1
      1 year ago

      That’d be like giving an alcoholic a pint at the end of the week as a reward for the alcoholic behavior they want out of.

      That’d be like giving money to a gambling addict as they promise to ‘pay you back’ for the loan you’ve given them.

      My point is, enabling people’s worst habits is always a bad idea.

      And how can you guarantee for certain that after a while of this AI-generated CP crap, they wouldn’t eventually want the real thing down the road and therefore attempt crimes?

      Your solution is just dumb altogether.

      • Lowlee Kun
        -1
        1 year ago

        There is literally no data to back up your slippery slope argument.

        • snownyte
          1
          1 year ago

          You really like spamming that “slippery slope” term, don’t you? It’s like your ultimate go-to for feeling like you’re superior. Just wait until one of these days you use it in a context where it doesn’t fit and you look like a dumbass.

    • Ataraxia
      -8
      1 year ago

      Jfc, what’s with these pedo apologists? If someone were a cannibal, would it be totally fine to just give them human flesh removed in surgeries or from dead people? Maybe let them pay people to eat them and drink their blood? AI images are trained on actual CP, and CP should not be normalized anyway. If someone has ideations of violence, then the last thing you do is feed those ideations. Would you think a suicidal person should watch simulated suicide? Why would watching simulated acts of depraved violence, because you enjoy them, somehow prevent you from committing that act yourself? If you enjoy something that much, then you are thinking about doing it yourself.

      • @mordred@lemmy.world
        4
        1 year ago

        Actually, the analogy here would be to give “wannabe cannibals” synthetic meat/stuff that tastes like human flesh.

    • @eatthecake@lemmy.world
      -14
      1 year ago

      I don’t believe it’s a sickness. Humans vary in innumerable ways, and defining natural variations as sickness is a social decision, not a medical one. If you look at the DSM you will find that social problems are sometimes given as a reason for defining something as an illness. This is just the medicalisation of everything.
      Even if you grant that it’s a sickness, how does it follow that the sickness should therefore be treated with AI? I see no argument or logic here. Do you think harm would be done if the paedophile knows the child? If the child finds out they are the object of rape fantasies? If you find you are married to a person who gets off on raping children? Your children?
      Do you allow for disgust and horror at sadistic desires, or are we ‘not allowed to kink shame’?

    • @treefrog@lemm.ee
      -17
      1 year ago

      Sex offenders aren’t allowed to watch porn at all in my state.

      Because science suggests that watching porn, and getting your fix through porn as you put it, encourages the behavior.

      Watching child porn teaches the mind to go to children to fulfill sexual urges. Mindfulness practice has been shown to be effective in curbing urges in all forms of addiction.

      So, no. Just no to your whole post.

      There’s effective treatment for addictions, whether sexual or otherwise, whether the addiction feeds on children or heroin. And we don’t need to see if fake child porn helps. Evidence already suggests it doesn’t, and we already have effective treatments that don’t put children at risk and don’t encourage the behavior.

      • Lowlee Kun
        17
        1 year ago

        Not judging/downvoting your comment, but do you have the data at hand? Just out of interest.

        Some input though: you’re not making a distinction between offenders and non-offenders, and I doubt there is even good data on non-offenders to begin with.

      • Forbo
        11
        1 year ago

        As mentioned on another one of your comments, I am having a hard time finding the science you reference.

      • Pooptimist
        4
        1 year ago

        This isn’t about addiction, it’s about sexuality. And you can’t just curb your whole sexuality away. These people have a disorder that makes them sexually attracted to children. At this point there is no harm done yet. They just are doomed to live a very unfulfilling life, because the people with whom they want to engage in sexual practices can’t give their consent, which is morally and legally required, no question about that. And most of them don’t give in to these urges and seek the help they need.

        But still, you can’t just meditate your whole sexuality away. I don’t want to assume, but I bet you also masturbate or pleasure yourself in one way or another; I know I do. And when I was young, fantasy was all I needed, but then I saw my first nude and watched my first porno and it progressed from there, and I’m sure fantasy won’t be enough for these people either. So when they get to the stage where they want to consume media, I’d prefer it to be AI-created images or some drawn hentai of a naked young girl or whatever, and not real abused children.

      • @lolcatnip@reddthat.com
        16
        1 year ago

        Not child porn. AI produces images all the time of things that aren’t in its training set. That’s kind of the point of it.

        • @SuddenlyBlowGreen@lemmy.world
          -15
          1 year ago

          AI produces images all the time of things that aren’t in its training set.

          AI models learn statistical connections from the data they’re provided. They’re going to see connections we can’t, but they’re not going to create things that aren’t connected to their training data. The closer the connection, the better the result.

          It’s a pretty easy conclusion from that that CSAM will be used to train such models, since training requires lots of data, and new data to create different and better models…

          • @BetaDoggo_@lemmy.world
            2
            1 year ago

            Real material is being used to train some models, but suggesting that it will encourage the creation of more “data” is silly. The amount required to fine-tune a model is tiny compared to the amount that is already known to exist. Just like how regular models haven’t driven people to create even more data to train on.

            • @SuddenlyBlowGreen@lemmy.world
              1
              1 year ago

              Just like how regular models haven’t driven people to create even more data to train on.

              It has driven companies to try to get access to more of the data people generate to train the models on.

              Like ChatGPT on copyrighted books, or Google on emails, docs, etc.

              • @BetaDoggo_@lemmy.world
                2
                1 year ago

                And what does that have to do with the production of CSAM? In the example given the data already existed; they’ve just been more aggressive about collecting it.

                • @SuddenlyBlowGreen@lemmy.world
                  0
                  1 year ago

                  Well, now in addition to regular pedos consuming CSAM, there are additional consumers: people using huge datasets of it to train models.

                  If there is an increase in demand, the supply will increase as well.

                  • @BetaDoggo_@lemmy.world
                    2
                    1 year ago

                    Not necessarily. The same images would be consumed by both groups, there’s no need for new data. This is exactly what artists are afraid of. Image generation increases supply dramatically without increasing demand. The amount of data required is also pretty negligible. Maybe a few thousand images.

    • Llamatron
      -36
      1 year ago

      Because eventually looking at images might not be enough

      Edit: Do we want to be normalising this? It’s disturbing how there are people defending it.

        • Llamatron
          -17
          1 year ago

          But you will at least have an outlet if you get yourself a partner or hire an escort. There’s the prospect of sex in real life. You’re not forever limited to porn.

          • Lowlee Kun
            19
            1 year ago

            I haven’t had sex in years, yet luckily nobody thinks I am a danger to women. It is almost as if people do not suddenly feel the need to rape someone just because they don’t have sex.

            • JokeDeity
              7
              1 year ago

              Slippery slope arguments almost exclusively come from the only people they seem to affect. You see the same worrying mentality from religious people who tell you that without God they would be committing serious crimes. Most people have an inherent morality that these people seem to lack without strict legal or religious guidelines.

              • Lowlee Kun
                7
                1 year ago

                I mean, that makes sense, I guess. I hate these “arguments” because they kill the debate. On the other hand, I’m totally not used to seeing any debate on this topic without it derailing into people calling each other pedos. So props to most people in here.

      • @Sir_Kevin@lemmy.dbzer0.com
        45
        1 year ago

        By that logic almost everyone in Hollywood should be in prison for depicting violence, murder, rape, etc. in movies and shows. This argument was put to rest back in the ’90s.

    • @PM_Your_Nudes_Please@lemmy.world
      -42
      1 year ago

      While I don’t disagree with the initial premise, image AI requires training images.

      I suppose technically you could use hyper-realistic CGI CSAM, and then it could potentially be a “victimless” crime. But the chances of an AI being trained solely on CGI are basically non-existent. Photorealistic CGI is tough and takes a lot of time and skill to create from scratch. There are people whose entire careers are built upon photorealism, and their services aren’t cheap. And you’d probably need a team of artists (not just one artist, because the AI will inevitably end up learning whatever their “style” is and nothing more,) who are both capable and willing to create said images. The chances of all of those pieces falling into place are damned near 0.

      Maybe you could supplement the CGI with young-looking pornstar material? There are plenty of pornstars whose entire schtick is looking young. But they definitely don’t look like children because the proportions are obviously all wrong; Children have larger heads compared to their bodies, for example. That’s not something that an adult actress can emulate simply by being flat chested. So these supplemental images could just as easily end up polluting (for lack of a better word) your AI’s training, because it would just learn to spit out images of flat chested adult women.

      • db0
        33
        1 year ago

        Generative AI is perfectly capable of combining concepts. Teach it how to do photorealistic images of minors and photorealistic porn, and it can combine them to make CSAM without ever being trained on actual CSAM.

      • @captain_spork@startrek.website
        33
        1 year ago

        This is like telling someone to “stop liking rock music” or “stop enjoying ice cream.” People don’t decide what their preferences are, they just have them. If we can give pedophiles a way to release those urges without harming children that should be a good thing. Well not good, but positive in the relative sense at least.

          • Rikudou_SageA
            38
            1 year ago

            That’s because you’re not very bright.

              • @PolarisFx@lemmy.dbzer0.com
                3
                1 year ago

                It’s idiots like this that make me think of that story I saw the other day on Lemmy, where a disabled man was taking pictures of kids damaging his property. Someone saw him and called the cops. The cops came, questioned him, found out what was up, and released him. Meanwhile, morons in the neighborhood heard he had been taking pictures of kids and got arrested, and that was enough for them to brutally beat and kill him that same night.

          • @papertowels@lemmy.one
            4
            1 year ago

            But do you actually have a reply to the content of the comment? You’re essentially pulling a “smells like communism, it bad” instead of addressing the points brought up.

      • @photonic_sorcerer@lemmy.dbzer0.com
        21
        1 year ago

        That’s like telling gay dudes to stop liking dick. It’s brain chemistry and neural circuits, you can’t exactly just snap your fingers and be rid of the problem. Humans are complex creatures.

      • @lolcatnip@reddthat.com
        8
        1 year ago

        “Anyone who disagrees with me is a child rapist.” That’s the level of argumentation I expect from a child or a fascist.