• Repple (she/her)
    English
    22 points · 6 months ago

    Super disappointed if they’re doing this off-device. If we’re getting more language model crap, at least make it local, please.

    • @WalnutLum@lemmy.ml
      English
      4 points · 6 months ago

      The problem is that notably “powerful” AIs need pretty significant hardware to run well.

      As an example, the Snapdragon NPUs can, I think, barely handle 7B models.
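
      A rough sketch of why 7B is about the ceiling: weight memory alone dominates on-device inference. The figures below are back-of-envelope estimates (weights only, ignoring KV cache and activations), not vendor specs.

```python
# Back-of-envelope memory footprint of a 7B-parameter model
# at common weight precisions. Weights only; KV cache and
# activations would add more on top of this.
PARAMS = 7e9  # 7 billion parameters (assumed round number)

for name, bytes_per_param in [("fp16", 2), ("int8", 1), ("int4", 0.5)]:
    gib = PARAMS * bytes_per_param / 2**30
    print(f"{name}: ~{gib:.1f} GiB")
```

      Even aggressively quantized to 4-bit, a 7B model needs over 3 GiB just for weights, which is a big slice of the unified memory on a typical phone.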