I was wondering if there’s any blockchain project that could be integrated with GenAI tools to fingerprint all generated content. Every blockchain token could be linked, via its metadata, to an image or video that was produced, and companies could verify whether the content appearing on their platforms is AI-generated or not.
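Roughly the kind of lookup I have in mind, as a toy sketch (the dict stands in for the on-chain registry, and all the names here are made up for illustration):

```python
import hashlib

# Toy stand-in for an on-chain registry: fingerprint -> token metadata.
# In the actual idea, each entry would live in a blockchain token's metadata.
registry = {}

def fingerprint(content: bytes) -> str:
    """Content fingerprint: SHA-256 of the raw bytes."""
    return hashlib.sha256(content).hexdigest()

def register_generated(content: bytes, generator: str) -> str:
    """A GenAI tool would call this when it produces an image or video."""
    fp = fingerprint(content)
    registry[fp] = {"generator": generator}
    return fp

def is_ai_generated(content: bytes) -> bool:
    """A platform would call this to check uploaded content."""
    return fingerprint(content) in registry

img = b"...fake image bytes..."
register_generated(img, "some-image-model")
print(is_ai_generated(img))             # True
print(is_ai_generated(b"other bytes"))  # False
```

Note the obvious weakness: any re-encode, crop, or single changed byte produces a different hash, so a plain fingerprint like this is trivial to evade, which is part of the limitation I mention below.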

I think that, combined with government regulation, this could reduce the number of scams and the amount of misinformation.

Though I do understand its limitations, and that it would not be super difficult to get past such restrictions for a sufficiently determined party.

And it wouldn’t work for text or code, I suppose.

But it would make identifying AI content much easier.

Privacy could be maintained, as it is with Bitcoin and other cryptocurrencies.

But I wonder how you would get everyone, both governments and companies, to adopt such a system.


I remember reading about this discussion in some thread months ago, but I can’t find it.

I didn’t know where to post this, but I assume everyone on Lemmy is some kind of whiz, so even in this community someone might have the answer.

  • pugnaciousfarter@literature.cafe (OP), 3 days ago

    The point was to mitigate the harm I suppose, not entirely stop it.

    As for the mandatory doxxing, you’d just be registering on the blockchain, but wouldn’t give away any other information.

    So the only info you give away is whether something is real or not.

    • SomeoneSomewhere@lemmy.nz, 3 days ago

      Are you talking about the AI generator registering on the blockchain? Because there is essentially no incentive for them to do so and every incentive for them not to.

      If you mean genuine camera images being registered on the blockchain, that would give away at minimum the time the image was taken, and probably what kind of device it was taken with and all other images taken by the same user. That’s a lot of data.