• @Repelle@lemmy.world

    Super disappointed if they’re doing this off-device. If we’re getting more language model crap, at least make it local, please.

    • @WalnutLum@lemmy.ml

      The problem is, notably “powerful” AIs need pretty significant hardware to run well.

      As an example, the Snapdragon NPUs can, I think, barely handle 7B models.
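
      As a back-of-envelope illustration of why (my own numbers, not from the comment above): the weights alone for a 7B-parameter model take a lot of memory, even before counting the KV cache and activations.

      ```python
      # Rough sketch: memory needed just for a model's weights at
      # different quantization levels (ignores KV cache and activations).
      def weight_memory_gb(n_params: float, bits_per_param: int) -> float:
          """Gigabytes of memory for the weights alone."""
          return n_params * bits_per_param / 8 / 1e9

      for bits in (16, 8, 4):
          print(f"7B at {bits}-bit: ~{weight_memory_gb(7e9, bits):.1f} GB")
      # 16-bit: ~14.0 GB, 8-bit: ~7.0 GB, 4-bit: ~3.5 GB
      ```

      Even aggressively quantized to 4 bits, that is a few gigabytes that has to fit in (and stream through) whatever memory the NPU can reach, which is why phone-class hardware struggles with anything much bigger.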