• @MostlyGibberish@lemm.ee
    -12 points · 8 months ago

    Because people are afraid of things they don’t understand. AI is a very new and very powerful technology, so people are going to see what they want to see from it. Of course, it doesn’t help that a lot of people see “a shit load of cash” from it, so companies want to shove it into anything and everything.

    AI models are rapidly becoming more advanced, and some of the new models are showing sparks of metacognition. Calling that “plagiarism” is being willfully ignorant of its capabilities, and it’s just not productive to the conversation.

    • @A_Very_Big_Fan@lemmy.world
      -3 points · 8 months ago

      True

      Of course, it doesn’t help that a lot of people see “a shit load of cash” from it, so companies want to shove it into anything and everything.

      And on a similar note, I think a lot of it comes down to OpenAI profiting off of it and going closed-source. Lemmy being a largely anti-capitalist, pro-open-source group of communities, it’s natural to have a negative gut reaction to what’s going on. But not a single person here, nor any of my friends who accuse them of “stealing,” can tell me what is being stolen, or how it’s different from me looking at art and then making my own.

      Like, I get that the technology is gonna be annoying and even dangerous sometimes, but maybe let’s criticize it for that instead of shit that it’s not doing.

      • @MostlyGibberish@lemm.ee
        3 points · 8 months ago

        I can definitely see why OpenAI is controversial. I don’t think you can argue that they didn’t do an immediate heel turn on their mission statement once they realized how much money they could make. But they’re not the only player in town. There are many open source models out there that can be run by anyone on varying levels of hardware.

        As far as “stealing,” I feel like people imagine GPT sitting on top of a massive collection of data and acting like a glorified search engine, just sifting through that data and handing you stuff it found that sounds like what you want. That isn’t the case. The real process is, intentionally, similar to how humans learn things: if you ask it for something it’s seen before, especially if it’s seen it many times, it’s going to know what you’re talking about, even if it doesn’t have access to the real thing.

        That, combined with the fact that the models are trained to be as helpful as they possibly can be, means that if you tell it to plagiarize something, intentionally or not, it probably will. But if we condemned every tool that’s capable of plagiarism without acknowledging that it’s also helpful in the creation process, we’d still be living in caves drawing stick figures on the walls.
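
        To make that distinction concrete, here’s a toy sketch (my own made-up illustration, nothing from OpenAI’s actual systems): a “glorified search engine” keeps the raw documents and hands them back verbatim, while a model keeps only statistics it learned from them. The stand-in here is a tiny bigram Markov chain, vastly simpler than GPT, and the corpus, search, train, and generate names are invented for the example.

        ```python
        # Toy illustration only (made-up example, not OpenAI code): contrast a
        # "glorified search engine" that stores and returns its source text with
        # a model that keeps only learned statistics about that text.
        from collections import defaultdict
        import random

        corpus = [
            "the cat sat on the mat",
            "the dog sat on the rug",
            "the cat chased the dog",
        ]

        def search(query):
            # Retrieval: the raw documents are kept and returned verbatim.
            return [doc for doc in corpus if query in doc]

        def train(docs):
            # "Learning": count which word follows which. After this, only the
            # counts remain; the original sentences aren't stored in the model.
            counts = defaultdict(lambda: defaultdict(int))
            for doc in docs:
                words = doc.split()
                for a, b in zip(words, words[1:]):
                    counts[a][b] += 1
            return counts

        def generate(model, start, length=6):
            # Sample from the learned statistics, one word at a time.
            word, out = start, [start]
            for _ in range(length):
                followers = model.get(word)
                if not followers:
                    break
                word = random.choices(list(followers), weights=list(followers.values()))[0]
                out.append(word)
            return " ".join(out)

        model = train(corpus)
        print(search("cat"))           # hands back stored documents word-for-word
        print(generate(model, "the"))  # may produce a novel mix like "the cat sat on the rug",
                                       # or echo a training sentence if its patterns dominate
        ```

        Note how even this sketch can reproduce a training sentence word-for-word when its patterns dominate the counts, the “seen it many times” case above, without ever storing the sentence itself.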

      • @Mnemnosyne@sh.itjust.works
        2 points · 8 months ago

        One problem is that when people see those whose work may no longer be needed, or no longer as profitable, they rush to defend it, even if those same people claim to be opposed to capitalism.

        They need to be able to say ‘yes, this will replace many artists and writers, and that’s a good thing, because it gives everyone the ability to create bespoke art for themselves,’ while at the same time recognizing that, even though this is a good thing, it also means a societal shift is needed to support people outside of capitalism.

        • @MostlyGibberish@lemm.ee
          1 point · 8 months ago

          it also means a societal shift is needed to support people outside of capitalism.

          Exactly. This is why I think arguing about whether AI is stealing content from human artists isn’t productive. There’s no logical argument you can really make that theft is happening. It’s a foregone conclusion.

          Instead, we need to start thinking about what a world looks like where a large portion of commercially viable art doesn’t require a human to make it. Or, for that matter, what does a world look like where most jobs don’t require a human to do them? There are so many more pressing and more interesting conversations we could be having about AI, but instead we keep circling around this fundamental misunderstanding of what the technology is.