A judge in Washington state has blocked video evidence that’s been “AI-enhanced” from being submitted in a triple murder trial. And that’s a good thing, given that too many people seem to think applying an AI filter can give them access to secret visual data.

  • @abhibeckert@lemmy.world · 12 points · 8 months ago (edited)

    computers aren’t taking drugs and randomly pooping out images

Sure, no drugs involved, but they are running a pseudo-random number generator (statistically random by design) and using its output (along with non-random data) to generate the image.

The result: ask for the same image twice and you get two different images. Similar, but clearly not the same person; sisters or cousins, perhaps, but nowhere near usable as evidence in court.
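The behavior described above can be sketched with a toy stand-in for an image generator (the `generate` function and its shapes here are illustrative assumptions, not any real model's API): fixed conditioning plus seeded noise means a fresh seed yields a different output for the same request.

```python
import numpy as np

def generate(prompt_features, seed):
    """Toy stand-in for a generative model: combines fixed
    conditioning (the 'prompt') with seeded random noise."""
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal(prompt_features.shape)
    # Output is fully determined by (prompt_features, seed).
    return prompt_features + noise

prompt = np.ones((4, 4))        # same non-random input both times
a = generate(prompt, seed=1)
b = generate(prompt, seed=2)    # fresh seed, as a typical UI would use
print(np.allclose(a, b))        # False: same request, different image
```

The two outputs differ only because the seed differed, which is exactly the "sisters or cousins" effect the comment describes.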

    • @Gabu@lemmy.world · -17 points · 8 months ago

      Tell me you don’t know shit about AI without telling me you don’t know shit. You can easily reproduce the exact same image by defining the starting seed and constraining the network to a specific sequence of operations.
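The reproducibility claim itself is narrowly true at the RNG level, as a minimal sketch shows: pinning the seed (and running the same sequence of operations) reproduces the noise, and hence the output, bit for bit.

```python
import numpy as np

# Two independently constructed generators with the same seed
# produce identical noise sequences.
rng1 = np.random.default_rng(seed=42)
rng2 = np.random.default_rng(seed=42)

noise_a = rng1.standard_normal((4, 4))
noise_b = rng2.standard_normal((4, 4))

print(np.array_equal(noise_a, noise_b))  # True: exact reproduction
```

This only guarantees you can re-run the *same* generation; it says nothing about whether the generated details correspond to reality, which is the point the replies below press on.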

      • Natanael · 9 points · 8 months ago

But if you don’t do that, the ML engine has no introspective capability to realize it failed to recreate the original image.

        • @Gabu@lemmy.world · -11 points · 8 months ago

And if you take your eyes out of their sockets you can no longer see. That’s a meaningless statement.

          • @blind3rdeye@lemm.ee · 3 points · 8 months ago

The point is that the AI ‘enhanced’ photos have nice clear details that are randomly produced, and thus should not be relied on. Are you suggesting that we can work around that problem by choosing the random seed manually? Do you think that solves the problem?