• @suburban_hillbilly@lemmy.ml
    28 · 2 months ago

    Bet you a tenner that within a couple of years they start using these systems as distributed processing for their in-house AI training to subsidize the cost.
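
    The scheme being speculated about here resembles federated learning: each user device computes a local model update on its own hardware, and the vendor's server only averages the results. Below is a minimal, purely illustrative sketch of federated averaging (FedAvg) on a toy linear model; all names and data are hypothetical, not from any real vendor's system.

    ```python
    # Hypothetical FedAvg sketch: devices train locally, server averages.
    # Toy model: y = w * x, trained by gradient descent on each "device".

    def local_update(w, data, lr=0.1):
        """One gradient-descent step on a device's private data."""
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        return w - lr * grad

    def federated_round(global_w, device_datasets):
        """Each device trains locally; the server averages the weights."""
        local_ws = [local_update(global_w, d) for d in device_datasets]
        return sum(local_ws) / len(local_ws)

    # Three "devices", each holding samples of the true relation y = 3x.
    devices = [
        [(1.0, 3.0), (2.0, 6.0)],
        [(3.0, 9.0)],
        [(0.5, 1.5), (4.0, 12.0)],
    ]

    w = 0.0
    for _ in range(200):
        w = federated_round(w, devices)
    print(round(w, 2))  # converges toward 3.0
    ```

    The compute-heavy step (`local_update`) runs on the user's machine; the server only does cheap averaging, which is exactly how such a setup would shift training cost onto users' hardware.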

    • @8ender@lemmy.world
      6 · 2 months ago

      That was my first thought. Server-side LLMs are extraordinarily expensive to run. Offload the costs to users.