• @cm0002@lemmy.worldOP
      2
      2 days ago

      Not for gaming, but for running open-source AI models and other AI shenanigans. My 4080 Super has been filling my gaming needs and will for years to come, but it’s not enough for my AI interests lol

      The most I can get out of this 4080 is running a ~7B-param model, but I want to run cooler shit like that new open-source DeepSeek V3 that dropped the other day.
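
      Rough back-of-envelope for why 16GB tops out around a 7B model (weights only, ignoring KV cache and activations; the DeepSeek V3 total-parameter count is from memory, so treat it as approximate):

      ```python
      # Back-of-envelope: VRAM needed just to hold the weights.
      # billions of params * bytes per param ~= gigabytes
      def weight_vram_gb(params_billions, bytes_per_param):
          return params_billions * bytes_per_param

      models = {"7B model": 7, "DeepSeek V3 (~671B total)": 671}
      precisions = {"fp16": 2.0, "int4": 0.5}

      for name, params in models.items():
          for label, bpp in precisions.items():
              print(f"{name:28s} {label}: ~{weight_vram_gb(params, bpp):6.1f} GB")

      # 4080 Super = 16 GB: a 7B model fits (~14 GB fp16, ~3.5 GB int4);
      # DeepSeek V3 needs hundreds of GB even at int4.
      ```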

      • @Tja@programming.dev
        6
        2 days ago

        So you’re waiting for the AI bubble to burst because you can’t wait to run all the cool new AI models?

        • @cm0002@lemmy.worldOP
          2
          2 days ago

          Yea, the underlying tech is what interests me and I have a few potential use cases. Use cases that I would never entrust a random company with. For example, the concept of MS Recall is cool, but I’d never trust Microshit’s implementation. An open-source, local version where I’m in control of all the security implementation, though? Hell yea lol
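
          A rough sketch of the kind of thing I mean (screenshot -> OCR -> local full-text index). The library picks here (mss, pytesseract, SQLite FTS5) are just one possible all-local stack, and a real version would obviously need encryption and retention controls:

          ```python
          # Bare-bones local "Recall"-ish pipeline: screenshot -> OCR -> searchable index.
          # Everything stays on local disk; no network calls anywhere.
          import sqlite3
          import time

          import mss                 # pip install mss
          import pytesseract         # pip install pytesseract (needs the tesseract binary)
          from PIL import Image

          db = sqlite3.connect("recall.db")
          db.execute("CREATE VIRTUAL TABLE IF NOT EXISTS snaps USING fts5(ts, text)")

          def capture_once():
              """Grab the primary monitor, OCR it, and index the text."""
              with mss.mss() as sct:
                  path = sct.shot(output="snap.png")  # saves the screenshot, returns its path
              text = pytesseract.image_to_string(Image.open(path))
              db.execute("INSERT INTO snaps VALUES (?, ?)", (time.ctime(), text))
              db.commit()

          def search(query):
              """Full-text search over everything that has been on screen."""
              return db.execute(
                  "SELECT ts, snippet(snaps, 1, '[', ']', '...', 8) "
                  "FROM snaps WHERE snaps MATCH ?",
                  (query,),
              ).fetchall()

          if __name__ == "__main__":
              capture_once()           # in practice, run this on a timer
              print(search("invoice"))
          ```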

          • @Tja@programming.dev
            1
            2 days ago

            That’s the problem. If the use case is super cool, and 99% of people have no knowledge (or motivation) to set it up for themselves, the online services will keep existing, and the bubble won’t really burst.

            Even if some individual companies fail (and they will, there are some truly horrendous ideas getting funding), the big players will buy up the GPUs wholesale before they ever hit eBay.