• @tatterdemalion@programming.dev · 74 points · 11 months ago (edited)

    It literally cannot come up with novel solutions because its goal is to regurgitate the most likely response to a question based on training data from the internet. Considering that the internet is often trash and getting trashier, I think LLMs will only get worse over time.

    • @space@lemmy.dbzer0.com · 47 points · 11 months ago

      AI has poisoned the well it was fed from. The only solution to get a good AI moving forward is to train it using curated data. That is going to be a lot of work.

      On the other hand, this might be a business opportunity. Selling curated data to companies that want to make AIs.

      • @tatterdemalion@programming.dev · 10 points · 11 months ago

        I could see large companies paying to train the LLM on their own IP even just to maintain some level of consistency, but it obviously wouldn’t be as valuable as hiring the talent that sets the bar and generates patent-worthy inventions.

        • @MagicShel@programming.dev · 3 points · 11 months ago

          You can fine-tune a model on your own data today. OpenAI offers that right on their website, and big companies are already taking advantage. It doesn’t take a whole new LLM, and the cost is a pittance in comparison.
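
          For anyone curious, here’s a minimal sketch of what that looks like with the OpenAI Python SDK (v1.x); the training file name and base model below are just placeholder assumptions:

          ```python
          # Minimal fine-tuning sketch using the OpenAI Python SDK (v1.x).
          # "company_docs.jsonl" is a placeholder for a curated file of
          # {"messages": [...]} chat-format training examples.
          from openai import OpenAI

          client = OpenAI()  # reads OPENAI_API_KEY from the environment

          # Upload the curated training data.
          training_file = client.files.create(
              file=open("company_docs.jsonl", "rb"),
              purpose="fine-tune",
          )

          # Start a fine-tuning job on top of an existing base model.
          job = client.fine_tuning.jobs.create(
              training_file=training_file.id,
              model="gpt-3.5-turbo",
          )
          print(job.id, job.status)
          ```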

      • @DudeDudenson · 16 points · 11 months ago

        That’s the reason ChatGPT 3.5 is still great for anything prior to its cutoff date: it’s not constantly being updated with new garbage.

      • @Obi@sopuli.xyz · 14 points · 11 months ago

        Low-background steel, also known as pre-war steel, is any steel produced prior to the detonation of the first nuclear bombs in the 1940s and 1950s. Typically sourced from ships (either as part of regular scrapping or shipwrecks) and other steel artifacts of this era, it is often used for modern particle detectors because more modern steel is contaminated with traces of nuclear fallout.[1][2]

        Very interesting, today I learned.

    • @ArrogantAnalyst@feddit.de · 26 points · 11 months ago

      Also, the more the internet is flooded with AI-generated content, the more future training datasets will consist of old AI output rather than new human input.

    • @test113@lemmy.world · -16 points · 11 months ago (edited)

      Hi, I don’t want to say too much, but after being invited to some closed AI talks by one of the biggest chip machine manufacturers (if you know the name, you know they don’t mess around), I can tell you AI is, in certain regards, a very powerful tool that will shape some, if not all, industries by proxy. They compared it to the internet, in the sense that it will influence everybody’s life sooner or later, and you can either keep your finger on the pulse or get left behind. But they distinguished between the “AI” that’s floating around in the public sector and actual purpose-trained AI that’s not meant for public usage. Sidenote: they are also convinced the average user of an LLM is using it the “wrong” way. LLMs are only a starting point.

      Also, it’s concerning: I’m pretty sure the big players have already taken over the AI market, so I don’t trust that it will benefit all of us rather than only a select group (of shareholders).

        • @DudeDudenson · 8 points · 11 months ago

          Like when they claim your smart thermostat is now “AI-powered” despite the fact that it’s the exact same product it was two years ago.

        • @test113@lemmy.world · -1 points · 11 months ago

          Again, none of the people at this talk have anything to do with selling a product or pushing an agenda, whatever you might think. There was no press, no marketing, no product. It was basically a meetup of private equity firms discussing the implementation and impact of purpose-trained AI in diverse fields, and how that affects the business structure of the big single-family office behemoths: an industry summit for the private equity sector on the future of AI and how some of them (mainly big non-public SFOs) plan to implement it.

          Sometimes people just meet to discuss strategy; no one at these talks is interested in selling you anything or buying anything - they are essentially top management and/or members of large single-family offices and other private equity firms. They are not interested in selling or marketing something to the public; they are not public companies.

          It’s weird how you guys react; not everything is a conspiracy or a marketing thing. It’s pretty normal in private equity to have these closed talks about global phenomena and how to deal with them.

          These talks are mainly to keep the industry informed. I get that you don’t like it when the big SFOs hold a meeting to discuss their future plans on a certain topic, but it’s pretty normal for the elite to coordinate some of their investments. It’s essentially the offices of the big billionaire families putting their heads together on a topic that might influence their business structure. It is in no way a marketing strategy; on the contrary, it would be viewed negatively in the public eye that big finance is already coordinating to build AI into its strategy.

          But feelings don’t change facts. My point is: if the actual non-public big players are taking AI seriously, then so should you.

          • @mob@sopuli.xyz · 1 point · 11 months ago (edited)

            It’s not a conspiracy… You are obviously not involved in actual ML/AI work but in another sector. You aren’t offering any technical explanation.

            A lot of us are involved in the technical aspect and understand what is being said by management.

            • @test113@lemmy.world · 0 points · 10 months ago

              I never argued that I was in IT or tech; I deal with investments and PE. My point is that we in the PE/FO sector are going to invest in AI businesses in 24/25, not only in the “B2C market” but mainly in the B2B market and for internal applications. Whether you believe it or not, it’s gonna happen anyway.

      • @wewbull@feddit.uk · 21 points · 11 months ago

        So Nvidia (or Intel or AMD) told you that you need AI to stay competitive. Not only that, but you need a bespoke solution, not the toy version out on the net that everyone can get access to.

        Strangely enough, they have some wonderful products coming to market which would be just what you need to build a large training network capable of ingesting all your company data. They’d be happy to help you on this project.

        All they had to do to get you to drop your guard was invite you by name to a “closed talk”.

        • @test113@lemmy.world · 0 points · 11 months ago

          Haha, lol, what’s happening, why do you hate me? I’m just sharing an experience, an opinion.

          It’s not NVIDIA or AMD or any chip manufacturer, or anyone who has a product to sell to you. Most of them are not even publicly traded but are organized in family office structures. They don’t care about the B2C market at all; they are essentially private equity firms. You guys interpret anything to fit your screwed-up vision of this world. They don’t even have a product to sell to you or me; it was a closed talk with top industry leaders and their managers where they discussed their view of AI and how they will implement purpose-trained AI into manufacturing, etc. It has nothing to do with selling to the public.

          I have already said too much. Just let me tell you: if you think LLMs are the pinnacle of AI, you are very mistaken, and depending on your position in the market, you need to take AI into account. You can only dismiss AI if you have a position or job with no real responsibility.

          So weird how you guys think everything is either a sales pitch or a conspiracy; this was a closed talk to discuss how the leaders in certain industries will adapt to the coming changes. They could not care less about the B2C market, aka you as an individual.

          Again, none of the people at this talk have anything to do with selling a product or pushing an agenda or whatever you think. There is no press, there is no marketing - it was basically a meetup of private equity firms that discussed the implementation and impact of purpose-trained AI in diverse fields, which affects the business structure of the big single-family office behemoths.

      • @Buttons@programming.dev · 3 points · 11 months ago

        As long as AI isn’t outlawed or “regulated” in some stupid way, open-source AI models will stay competitive. People are interested in AI, working on it is exciting, and it doesn’t require a lot of code or other bullshit; this is exactly the type of thing the open-source community will work on.
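
        To illustrate the “not a lot of code” point, here’s a minimal sketch of running an open-weights model locally with the Hugging Face transformers library; the specific model name is just one example of a freely downloadable model:

        ```python
        # Minimal sketch: local text generation with an open-weights model via
        # Hugging Face transformers (TinyLlama is just an example model).
        from transformers import pipeline

        generator = pipeline(
            "text-generation",
            model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",
        )

        result = generator("Open-source models stay competitive because", max_new_tokens=40)
        print(result[0]["generated_text"])
        ```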