• @jagungal@lemmy.world
    14 points · 5 months ago

    Nobody has been able to make a convincing argument in favour of generative AI. Sure, it’s a tool for creating art: it abstracts the art-making process away so that the barrier to entry is low enough for anyone to use it regardless of skill. A lot of people have used these arguments to defend the tools, and some artists argue that because it takes no skill it is bad. I think that’s beside the point. These models have been trained on data that was, in my opinion, acquired both unethically and unlawfully. Nobody has conclusively demonstrated that the data was acquired and used in line with copyright law. That leads to the second, more powerful argument: they use the labour of artists without any form of compensation, recognition, permission, or credit.

    If, somehow, the tools could come up with their own styles and ideas, then it would be perfectly fine to use them. But until that happens (it won’t: nobody will see unintended changes in AI output as anything other than mistakes, because the model has no demonstrable intent), use of generative AI should be treated as plagiarism or copyright infringement.

    • @computerscientistII@lemm.ee
      3 points · 5 months ago

      So, how do art students learn? They do exactly the same thing. Only they do a lot less of it, because natural neural networks (aka brains) can’t process training data as quickly. It’s not as if every artist has to reinvent the wheel while generative AIs get to skip that step and thereby gain an unfair advantage.

      Look at inventions like the printing press! Did everybody like it? The Catholic Church certainly didn’t! Is it a fantastic piece of technology anyway? Sure is!

      • @djsoren19@yiffit.net
        4 points · 5 months ago

        Students learn techniques that they apply to their own personal style. The goal of art school isn’t to create a legion of artists that can churn out identical art, it’s to give young creatives the tools they need to realize the ideas in their head.

        AI has no ideas in its head. Instead, it takes in a bunch of an artist’s work, and then produces something that does its best to match the plagiarized artist’s style exactly.

        • @computerscientistII@lemm.ee
          1 point · 5 months ago

          We don’t know whether there’s much of a difference between an AI and a brain. If we look “into” an artificial neural net, all we see is a lot of “weights” that don’t make any sense. If we look into a brain, we can’t make any more sense of it either. The difference is smaller than we want it to be, because admitting that either gives AIs a lot of credit or makes us less than we want to be. We are very biased; don’t forget that.

      • @jagungal@lemmy.world
        4 points · 5 months ago

        Copyright gives the copyright holder exclusive rights to modify the work, to use the work for commercial purposes, and attribution rights. The use of a work as training data constitutes using it for commercial purposes, since the companies building these models are distributing and licensing them for profit. I think it would be a marginal argument to say that the output of these models constitutes copyright infringement on the basis of modification, but it’s worth arguing nonetheless. Copyright only protects a work up to a certain, hard-to-define amount of modification, but some of the outputs would certainly constitute infringement in any other situation. And these AI companies would probably find it nigh impossible to disclose specifically whose work the data came from.

        • Uriel238 [all pronouns]
          5 points · 5 months ago

          > Copyright gives the copyright holder exclusive rights to modify the work, to use the work for commercial purposes, and attribution rights.

          Copyright remains a system of abuse that does more to empower large companies to restrict artistic development than it does to encourage artists. Besides which, you’re failing to consider transformative work.

          As it is, companies like Disney, Time Warner and Sony have so much control over IP that artists can’t make significant profit without being controlled by those companies, and even then only a few don’t get screwed over.

          There are a lot of valid criticisms of AI, but the notion that it’s wrong to train them on work gated by IP law is not one of them… unless you mean to also say that human beings cannot experience the work either.