- cross-posted to:
- globalnews@lemmy.zip
Teen boys use AI to make fake nudes of classmates, sparking police probe::Parents told the high school “believed” the deepfake nudes were deleted.
“…and their principal, Asfendis, has vowed to raise awareness on campus of how to use new technologies responsibly.”
Surely all the teenage boys will understand, and only use the technology for wholesome purposes.
D.A.R.E. raised my awareness of drugs. I only used them for wholesome purposes.
Yup, put some right into my holes.
Responsibly…!
They might know it’s bad but not fully understand the potential harms. I made another comment on it
There is absolutely no way anyone could have possibly seen this coming.
How do you stop this tho
You don’t. Scissors, Polaroids, and Playboy have been around for decades. If you wanted to see your classmate’s face on a nude and photocopy it, you could.
Now it’s just easier and more believable. But it’s not any more stoppable.
Tbf the ease of creating multiple realistic images quickly along with the ability to rapidly share those images is a bit different than cut ‘n paste a completely unmatched head and torso.
The point is the same. Not your body just your head.
It was harder for minors to get porn back then.
I grew up “back then” (when VHS was new technology) and it really wasn’t very hard to get some sort of porn.
Even as a minor? I grew up way later than that and the only way we’d get porn before the internet really kicked off was to get lucky and find an old mag in a bush.
I grew up with internet porn, but it was also during the transition so I had magazines as a minor also. It was just as easy to have some, the only difference now is the volume that’s easy to access. If people had physical porn, it had to be physically hidden, and could then be physically found. And if you’re always buying more, it gets harder and harder to both hide it and keep track of what you have so that you’ll notice some missing.
Though my first memories of porn are from going to a corner store style store in a mall and just looking through the porn magazines they had on display. A lot of the employees running the stores didn’t give a shit. Maybe they wouldn’t have sold them to me, but they either didn’t even notice or just didn’t say anything when I was looking at porn before I even knew about jerking off.
Even as a minor?
Yup. Neighbour’s kid’s dad kept his Hustler stash in their garden shed in a crate behind the lawn mower.
Yh, I don’t like that my generation was the first to be exposed to freely available and copious amounts of porn. But on the flip side the internet is sooooooo useful and I would not endorse any government saying what should or shouldn’t be on the internet, the internet should be free and censorship would be a very slippery slope.
It’s a tough moral debate and I’m really not sure what the answer is.
In general, I’m against “censorship.” There’s also no reason why we should allow non consensual porn to be circulated. Abolishing online porn entirely would instantly solve the question of “revenge porn” and whether consent was obtained.
There’s a certain amount of logic to the idea that you should allow bad people to feel free to express themselves so they can be identified, but I don’t think that fully holds up with porn and can lead to women being harmed. Girls who are 18-22 aren’t in a position to fully resist the temptation being offered by a lot of sex work possibilities. They haven’t learned about money yet and getting offered “a lot” in the short term is going to be too hard for them to weigh against selling out their future at that age. For every “empowered sex worker” out there who makes a good living and really wants to be in that work, there are many more who were exploited or got into it because of mental illness or trauma. Commodification of sex is ultimately a feature of the capitalist system.
Then wtf is the point of saying “No OnE cOuLd HaVe SeEn ThIs CoMiNg”
woosh?
There might be a misunderstanding. I understand the original post is trying to say that it was obvious that problems like this would occur with the introduction of AI-generated images, but it also implies there’s an easy or obvious solution. There isn’t one, so what is the point of pointing this out?
I don’t read it as saying that there may be a simple solution? And I don’t know how to attack the problem other than maybe the possible threat of prosecution for distributing material that could be classed as CSAM.
Maybe an analogy would help clear this whole thread up. Let’s say you wake up tomorrow and you see headlines of scientists discovering a meteor that will hit earth in the next 48 hours. Then a couple of days later you read a meteor hit earth causing X deaths and Y billions of dollars in damages. Then you go to the comment section and read “There is absolutely no way anyone could have possibly seen this coming.” So then you’re thinking to yourself does this comment seem a bit weird or am I just dumb for missing something. So you ask “could this have been prevented somehow” (subtext you don’t really see anything obvious) but then you get confirmation it could not have been prevented so now you’re just like “wait then wtf was the original comment saying”.
And that is how I feel right now lmao.
Clicks
Go back to using natural intelligence and try render with brain. Images can’t be shared.
Brain sends data to hand, and hand renders it with pen and paper. What now?
feet carry drawing to photocopier
This reminds me of the funny picture about a black person being angry that white people can think of slurs and there’s nothing that can be done about it
The other comment about how this has been happening for a long time (with low tech methods) is true, and it’s also true that we can’t stop this completely. We can still respond to it:
An immediate and easy focus would be on what they do with the images. Sharing them around is still harassment / bullying and it should be dealt with in the same way as it currently is.
There’s also an education aspect to it. In the past, those images (magazines, photocopies, photoshop) would be limited in who sees them. The kids now are likely using free online tools that aren’t private or secure, and those images could stick around forever. So it could be good to highlight that
- Your friends and classmates may see them, and it may harm their lives. The images will likely stick around. Facial recognition algorithms are also improving, so it’s a legitimate concern that an image stored on a random site somewhere will be tied back to them.
- The images can be traced back to the creator, and the creator can face repercussions for it (for those without empathy, this might be the better selling point).
Your point 1 seems to forget something important: kids are often cruel, and bullying is frequently the point. So long term consequences for their classmates can be an incentive more than a deterrent.
To your first point, much to the benefit of humanity, and counter to popular belief, the internet is NOT forever. Between link rot, data purges, corporate buyouts, transmission compression losses, and general human stupidity, large swaths of the internet have vanished. Hell, just Macromedia selling out to Adobe ended up causing the loss of most of the popular internet games and videos for anyone in their mid to late 30s at this point (you will be missed, Flash). The odds of these specific AI-generated child porn pictures surviving even in some dark corner of the bright web are slim to none. And if they end up surviving on the dark web, well, anyone who sees them will likely have a LOT of explaining to do.
Also, regarding the commentary about the websites keeping the images: that is doubtful, beyond holding them in an account-bound locker for the user to retrieve. They don’t care, and too many images get generated every day for them to see it as more than reinforcement training.
Speaking of reinforcement training, they may have been able to use Photoshop’s new generative fill to do this, but to actually generate fresh images of a specific peer they would have had to train a LoRA or Hypernetwork on photos of the girl so that Stable Diffusion could actually resolve it. They weren’t doing that on an AI site, especially not a free one. They were probably using ComfyUI or Automatic1111 (I use both myself). Those are free, open-source, locally executed programs that let you use the aforementioned tools when generating. That means the images were restricted to their local machine, then transferred to a cell phone and distributed to friends.
I think we should pressure EU to make it such that any online AI photo generating website also uses AI to make sure what was asked is not illegal.
My niece had this same issue a few years ago but with Photoshop. It absolutely ruined her. Changed schools multiple times (public and private) but social media exists so all the kids knew. She ended up getting homeschooled for the last 5 years of school as well as a fuckload of therapy. She came out the other side okay but she has massive trust issues and anxiety
Man that’s awful, poor thing.
Ai porn to come w this disclaimer:
In Spain it happened recently with some 12-year-olds… it created a country-wide debate and, as always, did not lead to any regulation. Hopefully the EU will do something.
This can be prosecuted with existing CP laws.
Wait When did Spain join the EU?
in '86
We are a full member, we have the Euro and we’re in schengen
Did you just wake up from a 40 years long hangover ? Welcome mate
Yes
It’s harder to remember which countries aren’t in, or aren’t trying to be in, the EU…
A couple months before the Chernobyl power plant disaster.
This is the best summary I could come up with:
This October, boys at Westfield High School in New Jersey started acting “weird,” the Wall Street Journal reported.
It took four days before the school found out that the boys had been using AI image generators to create and share fake nude photos of female classmates.
Biden asked the secretary of Commerce, the secretary of Homeland Security, and the heads of other appropriate agencies to provide recommendations regarding “testing and safeguards against” producing “child sexual abuse material” and “non-consensual intimate imagery of real individuals (including intimate digital depictions of the body or body parts of an identifiable individual), for generative AI.”
“New York State currently lacks the adequate criminal statutes to protect victims of ‘deepfake’ pornography, both adults and children,” Donnelly said.
Until laws are strengthened, Bramnick has asked the Union County prosecutor to find out what happened at Westfield High School, and state police are still investigating.
Until the matter is settled in the New Jersey town, the girls plan to keep advocating for victims, and their principal, Asfendis, has vowed to raise awareness on campus of how to use new technologies responsibly.
The original article contains 950 words, the summary contains 184 words. Saved 81%. I’m a bot and I’m open source!
Pictures? We are on the edge of believable videos with AI-produced voices and sounds, made on normal computers. We need to clear a few more hurdles in 3D AI modeling, VR, and haptic feedback before this trend reaches its obvious conclusion.
Wonder what crime it would be called if you create a haptic VR double of someone without their consent and don’t distribute it?
Haptics are never going to be like in Ready Player One. It’s crazy to me that anyone believes the tech will be capable of that. Like, how diminished is one’s sense of touch that one could believe it could be fooled by fancy rumble packs? Touch is so much more complex than that. Piezoelectric motors vibrating are not going to be able to fake solidity. Nuts to me that people think that.
Might be possible with big gel tanks that people get submerged in, so the gel would be somehow hardened or softened with precise and weak electric currents, emulating textures.
But imo, it’s more likely that it’ll happen through some brain interface and the whole experience will basically be a very lucid dream.
Lots of time until that though, unless we destroy ourselves first. At least I doubt it’ll happen during my lifetime.
It seems more likely to happen through a brain interface, but I’m also increasingly skeptical that will ever be possible. Optimistic estimates for a full brain interface are a century-plus, just by looking at the number of direct neuron measurements we currently have and applying an (optimistic) Moore’s-law-style exponential curve: https://waitbutwhy.com/2017/04/neuralink.html
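The “Moore’s-law-style” extrapolation mentioned above can be sketched in a few lines. All numbers below are illustrative assumptions (a rough current count of simultaneously recorded neurons and an assumed historical doubling period), not figures from this thread or the linked article:

```python
import math

# Hypothetical inputs, for illustration only:
current_neurons = 1_000          # rough order of simultaneously recorded neurons today
doubling_period_years = 7.4      # assumed historical doubling time of that count
target_neurons = 86_000_000_000  # approximate neuron count of a human brain

# Doublings needed to go from current capability to whole-brain recording,
# then converted to years at the assumed historical rate.
doublings = math.log2(target_neurons / current_neurons)
years_needed = doublings * doubling_period_years
print(f"~{years_needed:.0f} years at the historical rate")  # → ~195 years
```

Even with generous starting numbers the extrapolation lands well past a century, which is the point being made.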
VR is real fun, but it’s fundamentally just another display technology. It’s less “Ready Player One” and more “what 3D TVs promised and failed to be”.
True that. I just can’t wait until there are full headsets that are as small as glasses with wireless data transmission to my PC. There are at least a few companies that are coming closer every year, like Meta. Not a big fan of Meta though.
They’re probably never going to be as small as glasses just due to hard physical limits in optics.
Maybe. I like to think that it’s just a matter of time though.
Like how diminished is one’s sense of touch that one could believe it could be fooled by fancy rumble packs?
Have you ever used a macbook trackpad? The click is just a fancy rumble pack. We can use electricity to make glass opaque. If the only thing stopping a person from living in a VR pod is haptic feedback, it’ll be solved in a fortnight.
Then why hasn’t it been solved? It’s been nearly a decade since the oculus sdk came out.
And if you think the Mac trackpad haptics are indistinguishable from a real button click, you’re… not very perceptive imo. Don’t mean that as an attack. Just open your mind to the idea that other people can def tell the difference.
Thanks authorities, for taking seriously the safety of our students in the classroom. I also saw a kid eat glue.
Given that AI images and media can’t be copyrighted, does the nominal “subject” have any recourse?
Not sure about other places, but here in Brazil creating a fake nude of somebody and distributing it would be illegal
In the US a photoshopped nude would be copyrightable. But courts here have said that AI generated content doesn’t get the benefits of copyright.
Don’t only think of copyright. People don’t copyright CSAM, but they go to jail for making/distributing it.
That’s a good point.
Not being able to apply for copyright doesn’t prevent you from getting charged for infringement.
“I made it with AI, it’s not copyrightable” is the 2023 version of uploading a show to Youtube and adding in the details “I do not claim Copyright on this material. All Copyrights belong to their respective holders”. It’s still illegal even if you don’t claim to own it.
There’s the matter of consent, and it might legally be along the same lines of giving someone a roofie so they don’t remember in the morning.
That just means the person that used AI to make something can’t claim those rights for the generated content – other laws still apply.
Everyone else still retain rights to their likeness in most places, and I’d imagine that still stands in this case.
Historically, I know that a big way that the dissemination of these sorts of images was stopped was by using copyright law (because they’re using the likeness of the subject). I’m worried how that will work if there’s no copyright law to fall back on.
If you’re making porn of real underage people, I have no problem with you being put on the pedo registry.
If no serious harm was done, I’m fine with convicting them and then doing full expungement after 5-10 years.
And you’re proof that the pedo registry shouldn’t exist as is.
Teenagers being sexually interested in their peers is not pedophilia, and you want to ruin a decade of their life guaranteed, with the “promise” of an expungement that would never actually happen, thanks to the permanent nature of the internet.
This misuse of AI is a crime and should be punished and deterred, obviously. But labeling children about to enter the world as pedophiles basically for the rest of their lives?
You’re kind of a monster.
What about the fact that the girls who are victims of something like this will have to contend with the pictures being online if someone posts them there? What if people who don’t know that the pictures depict minors re-post them to other sites, making them very difficult to remove? That can cause very serious employability problems. It doesn’t matter how open-minded people are, they don’t want porn coming up if someone googles one of their employees.
The creation is still a crime, no one said otherwise.
It is just not an act of pedophilia.
If you produce CP, you should be on a registry for producing and distributing CP. If you create CP, you are enabling pedophilia.
They are children. Being horny about classmates.
Being sexually aroused by people your own age and wishing to fantasize about it is not enabling pedophilia, you literal psychopath.
Circulating porn of minors is a crime and enables pedophiles. Not to mention teenage girls could easily commit suicide over something like this.
So does yearbook and any other kind of photos that depict children for that matter
You can’t keep pushing the goal posts, by your logic young people should never date or take photos together because it could enable pedophiles somewhere somehow
These are children with brains still in development, they are discovering themselves and you want to label them forever a pedophile because they didn’t make a conscious effort to research how their spanking material could potentially enable a pedo (because we all know pedos can only be enabled by things produced by kids… yeah that’s the real threat)
Instead of suggesting a way to help the victims you are advocating for the creation of yet more victims
What a pathetic brain dead stance you are defending
Abuse and bullying of their classmates is just ‘discovering themselves’? Discovering that they’re psychopathic little misogynists, I guess. Their ‘spanking material’ was created in order to demean and humiliate their victims. There’s plenty of porn online and absolutely no need for them to do this. If you actually wanted to help the victims you would not be trivialising and excusing this behaviour as ‘being horny about classmates’.
A yearbook photo is not porn.
And an AI image with a face photoshopped over it isn’t a photo of a child.
And a teen being sexually interested in other teens isn’t a pedophile.
I’d argue that someone making porn of someone their own age is not pedophilia.
They’re still making porn of a minor. That is harmful to them and it enables any pedophiles who find it.
That’s an easy enough judgement when the perpetrator is an adult. What do you do when the perpetrator is a minor themselves? As they are in this article.
Of course there still needs to be some sort of recourse, but for every other crime there is a difference between being tried as a child and being tried as an adult.
I find it tough to consider myself.
Considering the consequences for a high school student if porn of them gets circulated, I’m fine with putting them on the registry. Expungement can happen later based on the aftermath. Teenage girls have killed themselves over this sort of thing.