A sex offender convicted of making more than 1,000 indecent images of children has been banned from using any “AI creating tools” for the next five years in the first known case of its kind.

Anthony Dover, 48, was ordered by a UK court “not to use, visit or access” artificial intelligence generation tools without the prior permission of police as a condition of a sexual harm prevention order imposed in February.

The ban prohibits him from using tools such as text-to-image generators, which can make lifelike pictures based on a written command, and “nudifying” websites used to make explicit “deepfakes”.

Dover, who was given a community order and £200 fine, has also been explicitly ordered not to use Stable Diffusion software, which has reportedly been exploited by paedophiles to create hyper-realistic child sexual abuse material, according to records from a sentencing hearing at Poole magistrates court.

  • @Doof@lemmy.world · 3 · 7 months ago

    What do you know of the current methods? Where did this information come from? I’d really like to see it. You spoke with such knowledge, you must have the data to back it up, right?

    • @Allero@lemmy.today · 3 · 7 months ago

      The approach was originally pioneered by the Prevention Project Dunkelfeld and later adopted for wider use in Germany, elsewhere in Europe, and abroad.

      Studies have shown that this approach works, which led to its widespread adoption.

      You can read details of the treatments coming out of this research here.

      Beware of corporate paywalls, and keep good old Sci-Hub ready if you want to read the sources in full text.

      • @WormFood@lemmy.world · 0 · 7 months ago

        i read three of the sources you provided (all of them, except the book), and the only thing you’ve said which is true is that the treatment ‘includes acceptance of their desires’ (though you have added the words ‘as normal’)

        the other two claims you’ve made, including ‘it does not prohibit any fictional materials including children’ and ‘by stripping away safe outlets we may come at risk of these people increasingly turning to real CSAM’, are your own inventions. they are not stated anywhere in the texts you have linked; in fact, they are directly refuted by both of them, because the actual prevention project recommends a combination of cognitive behavioral therapy and medication

        • @Allero@lemmy.today · 3 · edited · 7 months ago

          I specifically addressed the “current methods” part of it, as questioned.

          The second point was beyond the scope of the sources I provided, except maybe the book, but the project is in line with this as well: it does not focus on fictional materials and does not explicitly prohibit them. It doesn’t encourage the consumption of such materials either, so its position is best described as “neutral”. It does, however, strongly object to real CSAM.

          The latter was answered in another thread - yes, you are right that this is my own speculation; as far as I know, the scientific community currently doesn’t have data to either prove or disprove this point. But it seems likely to me.