• @0ddysseus@lemmy.world
    4 · 1 year ago

    (Apologies if I use the wrong terminology here, I’m not an AI expert, just have a fact to share)

    The really fucked part is that Google, at least, has scraped a whole lot of CSAM, as well as things like ISIS execution videos, and they have all this stuff stored and use it for things like training the algorithms for AIs. They refuse to delete this material, claiming that they just find the stuff and aren’t responsible for what it is.

    Getting an AI image generator to produce CSAM means it knows what to show. So why is the individual in jail and not the tech bros?

    • @diffuselight@lemmy.world
      32 · edited · 1 year ago

      That’s a fundamental misunderstanding of how diffusion models work. These models extract concepts and can effortlessly combine them into new images.

      If it learns woman + crown = queen

      and queen - woman + man = king

      it is able to combine any such concepts together.
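
      As a rough illustration of that concept arithmetic (a toy sketch with made-up vectors, in the spirit of word2vec-style embedding math, not any real diffusion model’s internals):

      ```python
      # Toy "concept arithmetic" in an embedding space.
      # Every vector here is hypothetical, chosen only to make the analogy run;
      # real models learn high-dimensional embeddings, but the algebra is the same idea.
      import numpy as np

      concepts = {
          "woman": np.array([1.0, 0.0, 0.0]),
          "man":   np.array([0.0, 1.0, 0.0]),
          "crown": np.array([0.0, 0.0, 1.0]),
      }
      # Composites the model has "learned" as combinations of simpler concepts:
      concepts["queen"] = concepts["woman"] + concepts["crown"]
      concepts["king"] = concepts["man"] + concepts["crown"]

      def nearest(vec):
          """Return the known concept closest to vec by cosine similarity."""
          def cos(a, b):
              return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
          return max(concepts, key=lambda name: cos(concepts[name], vec))

      # queen - woman + man recovers the "king" vector:
      print(nearest(concepts["queen"] - concepts["woman"] + concepts["man"]))  # -> king
      ```

      The point about “naked” + “child” below is the same arithmetic: if both concepts exist in the model, nothing structural prevents combining them.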

      As Stability has noted, any model that has both the concept of “naked” and the concept of “child” in it can be used like this. They tried to remove “naked” from Stable Diffusion 2, and nobody used it.

      Nobody trained these models on CSAM, and the problem is a dilemma in the same way a knife is a dilemma. We all know a malicious person can use a knife for murder, including of children. Yet society has decided that knives have sufficient other uses that we still allow their sale pretty much everywhere.