Criminalizing the creation, possession, or viewing of entirely artificial artwork is beyond unethical; it’s extraordinarily evil.
I don’t care if you find someone’s artwork gross, troubling, distasteful, immoral, etc… that’s art.
Careful, any time I point this out, the fascists come out of the woodwork to call me a pedo.
That’s literally the whole point I am making: It doesn’t matter how I feel about it, it doesn’t matter how YOU feel about it. It’s not real. Neither you nor I nor anyone else has the right to judge someone else’s art.
In this case there are several crimes, but in the other case mentioned, involving a Korean, there is nothing: only possession of generated content, with the argument that it is highly realistic (someone could say the same even of a sketch). Imprisoning someone for acts that have no victims and cause no harm, directly or indirectly, is more aberrant than possessing that material.
PS: I’m just talking about legality and rights. I know it’s controversial and I’m sure someone will have something to argue against it, but if you’re going to accuse me of being a pedo, just get lost, you moron.
Removed by mod
People are getting way overexcited about AI at the moment. If a crime, or perceived crime, is even remotely related to AI, it becomes the main focus.
Like the person who was hit by a self-driving car: the case was really about a hit-and-run driver who hit the pedestrian first and threw them into the path of the self-driving car. Had the self-driving car not been there and a human driver been in its place, pretty much the same thing would have happened, but everyone focused on the AI aspect.
If I used an AI to commit fraud, it would be me who committed the fraud, not the AI, but you can be damn sure people would get hung up on that aspect of the case and not on the part where I committed a crime.
It’s the same as when Ford introduced the Transit van (I have no idea what the equivalent in the US market was). It was faster than most cars at the time, could carry heavier loads, and was physically larger. Inevitably it got used in a lot of bank robberies, because the police literally couldn’t keep up with it. People started talking about putting a performance limit on vehicles, when the actual solution was that everyone else just needed better cars. If a performance limit had actually been implemented, it would have held everyone back.
deleted by creator
I thought it was obvious, but OK, I’ll explain it to you. The story isn’t really about AI; it involves an AI, but that’s got absolutely nothing to do with the crime that was happening, so why are we obsessing over it?
The guy committed a crime. And, as a separate matter, he also used AI.
The AI did not enable him to commit the crime, the AI did not make the crime worse, the AI did not make the crime possible, and he did not use the AI to plan the crime. The use of the AI was entirely incidental to the crime.
Is it really fascists doing that? Literal fascists? I don’t meet many of them in my daily life.
Lucky you!
Just tell them you support a two-state solution and a ceasefire; then they’ll become apparent to you.
Removed by mod
Leave the house occasionally!
If you have AI pornography of children, regardless of there being no real victim, you’re a fucking pedo.
Period. End of argument.
Get help.
It’s basically the same as drawing it. I think most countries already legislate against this.
Here’s a piece of art by Balthus. It’s of a young girl in a skirt, leg hiked up and you can see her underpants: https://www.wikiart.org/en/balthus/thérèse-dreaming-1938
This piece is controversial, but it’s evocative and thought-provoking, and it says something about an innocent time in our youth and the change in demeanor that sexuality brings once we become aware of it.
People may not like this, but if you can set sexuality aside and understand that we were once “innocent” (meaning sex wasn’t something we knew about; we just had these bodies we were told to hide in clothes), the painting takes on a whole new meaning.
I’m not advocating for fake cheese pizza photos, fuck those sickos, but art can appear to be one thing at first glance and then take on a new meaning as we study and understand more.
Yeah I’m not clicking on that.
It’s a great painting!
Attempting to normalize and destigmatize representations of child sexual abuse by calling it art is extraordinarily evil.
Like Siesta by Arthur Berzinsh? It depicts child-like cherubs engaged in acts extremely close to eproctophilia with an adult woman.
I’m not even going to ask what that is.
Your first passage about criminalizing art is 100% correct and 100% irrelevant. You cannot call porn art. Porn with adults, children, dogs, pumpkins: all of that is made for people to get off, not to enjoy the emotions that real art provokes. Therefore we cannot compare criminalizing porn with criminalizing art.
There are edge cases, of course, where art might be provocative, considered immoral, and maybe even illegal sometimes. But those would be edge cases, and highly debated.
Removed by mod
Don’t they often train the program on adult porn, and then the AI just puts a child’s face onto bodies generated from that training data? I imagine these AI companies are scraping data from popular porn sites, or just paying for the data, and those sites work hard not to have CP on them. The result is a child’s face on a body too mature for it. Remember that some actual adult actresses have body proportions many would consider underdeveloped, and someone generating these pictures could keep regenerating until the AI uses those proportions.
The point is that you don’t need CP to train an AI to make CP. I’m not justifying any moral position here, just pointing out a fact about AI technology.
Removed by mod
You don’t know what you’re talking about.
In this case the guy did have real images, but you don’t need them. AI is intelligent in a sort of hard-to-define way; it picks up on things.
It picked up that people like younger individuals in pornography, so it took that to the logical extreme. AI is weird because it’s intelligence without any actual thought. But it can totally generate variations on things it’s already seen, and a kid is just a variation on a young adult.
Removed by mod
deleted by creator
Yes, AI can create tons of content it’s not trained on.
deleted by creator
No it isn’t.
No, it’s child porn.
Can’t imagine why.
You realise the AI is being trained on pictures of real children, right?
So it’s wrong for it to be based on one child, but according to you the AI “art” (as you keep calling it) is okay as long as there are thousands of victims instead?
So you’re cool with images of 6 year olds being penetrated by a 40 year old as long as “tHe Ai DrEw iT sO nObOdY gOt HuRt”? I guess you could just set it as your desktop and phone wallpaper and everything would be fine. Let me know how that works out for you.
That’s some stunning mental gymnastics right there.
Removed by mod
It’s not art you pedo. Gtfo
It does matter how I and wider society view disgusting content. It matters a lot. And society absolutely has a say in whether or not such content is acceptable. You saying otherwise is absurd.
In the same way that I can’t and shouldn’t write something incredibly racist and pretend it’s ‘art’. Even if AI made it.
Attempting to give AI child porn a pass, as you are doing for some baffling reason, will absolutely create further harm down the line.
I’d say it’s because the person you’re replying to rightfully sees it as a slippery slope. If you say this fake image that didn’t directly harm anyone is illegal, what’s to stop you from saying some other fake image that’s much more in line with social tastes is also illegal? An artwork made of human shit, for example. Most people would be repulsed by that, but it doesn’t change the fact that it could be art. As long as it doesn’t concretely harm someone, it’s hard to equate it with actual harm.
It’s child porn.
Child. Pornography.
It is not “Art”.
The slippery slope is people like you confusing the two and trying to somehow justify CP as free speech/art.
I don’t care how it is made. There is a line. This crosses it. Simple as that.
Removed by mod
You don’t need CP to get AI to make CP. Please educate yourself on AI technology.
deleted by creator
Removed by mod
I know you know this, but you are not crazy. I’m astonished you are being downvoted so hard. The pedo apology is so strong it’s making me not want to use Lemmy. This thread is worse than Reddit.
Terrifying.
Indeed, it’s making me want to go back to Reddit.
I left when the API price changes kicked in. At first Lemmy was alright, but then the extremists turned up, and the echo chamber in here is so ridiculous that there just isn’t much point in being here anymore.
It’s not just the pedo apologists (the next step will be AI CP actually being posted here and people defending it as “art”). Seeing, literally every single day, that YouTube trying to stop freeloaders leeching off it is somehow evil, that cars are evil, and that Linux is the second coming of Jesus (and I say that as a 20+ year Linux user) is incredibly tedious.
Sure, this existed on Reddit as well, but at least there was other content to dilute it, and for the most part people were reasonable rather than showing the rabid extremism I’m seeing here every day. There is no way in hell I would have seen an up/downvote ratio like the one in this pedo apologist conversation on Reddit.
Maybe it’s time to go back.
Pity. Oh well.
Can you share a source? Just as people use the internet to distribute CP, there are undoubtedly circles where people are using ML for CP. However, my understanding is that, by and large, popular models are not intentionally trained on any.
I am categorically not researching that.
Put it this way…
The pedophiles who are smart enough not to get caught, who use technology like Tor, encrypt everything, and can figure out how to use Stable Diffusion, will be the pedophiles who have custom models trained on real children.
And if you and I can consider the possibility in a casual online conversation, they have also considered it, heavily researched it, and implemented it if it’s at all possible. And they know how not to get caught.
But it’s okay, it’s “art” after all and we can’t ban art because that’s evil… Right… Right?
…okay, seeing as you haven’t actually done any research yet have arrived at a conclusion, a conversation about this is going to be difficult.
Let’s get more specific so we can have an actual conversation. When you say “the AI”, what do you mean? DALL-E, Midjourney, or some guy training and using his own model on a local computer?
Are you familiar with large models being able to compose concepts they’ve seen, to produce something not found in its training data?
What on earth makes you think I wish to have an extended conversation about this?
Child porn is not art. Even if AI made it.
Banning child porn is not immoral or evil.
Simple as that.
If you cannot accept that basic premise then I have nothing to say to you.
I have said literally nothing about ethics.
You used a technical assertion in your argument. Out of curiosity, I wanted to learn more and asked you for sources.
You can neither prove that technical assertion nor discuss it. I am now going to leave the conversation. Since you can’t prove or even discuss it, I’d hope you avoid using it in the future, or at least learn more about it.
I agree with what you are saying.
However, I think psychologists might not be fans of giving them access to that material. I think the reason is that they would end up looking for more and more extreme material, and they could end up offending as a result.
AFAIK we have yet to find out whether viewing AI-generated material makes an individual seek out real-life child abuse imagery.
I believe viewing the latter allows many to keep real-life urges under control (I might need to re-check the literature on that), but it obviously comes with its own issues. If we can make AI-generated child pornography, and if it doesn’t make people go looking for the “real stuff”, we might actually make a very positive impact on child safety.
According to the few studies we have from the nineties and aughts, most people with sexual attractions to kids are aware that acting on them can be harmful and will turn to alternative ways to explore them (when those attractions can’t be suppressed or redirected). So yes, now that we have victimless ways to produce porn, the objections are to the fetishes themselves, not to any resulting violent behavior.
That said, people commonly and openly express their distaste for such people, more so than for domestic violence offenders who assault their kids, just not sexually. The same general disdain for child-attracted adults does not translate into action to feed children or protect them from abusive households.
And whenever we’ve worried about fiction driving people to act it out in reality, it has historically been shown wrong every single time. Women are not driven to scandal and harlot behavior by trashy romance. Teens are not driven to violence by violent movies or video games. We can expect that porn featuring children is not going to compel someone to actually sexually assault kids.
deleted by creator
It would take a lot of shots to make a meaningful change in the dataset.
It would probably require training on existing data, which is itself questionable, but I lean toward thinking it might be worth it for the cause.