Running AI is so expensive that Amazon will probably charge you to use Alexa in future, says outgoing exec::In an interview with Bloomberg, Dave Limp said that he “absolutely” believes that Amazon will soon start charging a subscription fee for Alexa

  • @hark@lemmy.world
    11
    9 months ago

    Running AI may be currently expensive, but the hardware will continue to improve and get cheaper. If they institute a subscription fee and people actually pay for it, they’ll never remove that fee even after it becomes super cheap to run.

    • @Elderos
      4
      9 months ago

      That is sort of the issue when mixing good conscience with capitalism. Either goods are valued at what we’re willing to pay for them, or they’re valued at what we think the business’s profit margin should be, but mixing the two leads us to fall for PR crap. Businesses are quick to gather sympathy when margins are low, but as soon as they own a chunk of the market, it turns into raising prices as much as they possibly can.

      That being said, Amazon became what it is because Bezos was hell-bent on not rug-pulling customers, at least in the early years, so it’s possible they would eventually decrease prices to gain a market advantage; that’s been their whole strategy.

    • @barsoap@lemm.ee
      3
      9 months ago

      > but the hardware will continue to improve and get cheaper.

      Eh. I mean, sure, the likes of A100s will invariably get cheaper because they’re overpriced AF, but there isn’t really that much engineering going into those things hardware-wise: accelerating massive chains of FMAs (fused multiply-adds) is a much smaller challenge than designing a CPU or GPU. Meanwhile, Moore’s law is – well, maybe not dead, but a zombie. In the past, advances in manufacturing meant a lower price per transistor; that hasn’t been true for a while now, and the physics isn’t getting any easier – they’re now battling quantum uncertainty in the lithography process itself.
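      To make the FMA point concrete, here’s a toy sketch in plain Python (no real accelerator code, just an illustration of the workload shape): a matrix multiply is nothing but millions of independent multiply-accumulate chains, which is why silicon dedicated to it is comparatively simple.

      ```python
      def dot(a, b):
          # One FMA chain: repeatedly multiply and accumulate into a sum.
          acc = 0.0
          for x, y in zip(a, b):
              acc = acc + x * y  # a single fused multiply-add per step
          return acc

      def matmul(A, B):
          # Every output element C[i][j] is one such FMA chain; the chains
          # are independent, so they parallelize trivially onto an array
          # of multiply-accumulate units.
          cols = list(zip(*B))
          return [[dot(row, col) for col in cols] for row in A]
      ```

      An accelerator is, very roughly, thousands of copies of that inner loop in hardware; there is no branch prediction, cache hierarchy, or out-of-order machinery to design, which is the contrast with a CPU being drawn above.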

      Where there might still be significant headway to be made is by switching to analogue, but, eeeh. Reliability. Neural networks are rather robust against small perturbations, but it’s not like digital systems can’t exploit that by reducing precision, and controlling precision is way harder in analogue. Everything is harder there; it’s an arcane art.


      tl;dr: Don’t expect large leaps, especially not multiple. This isn’t a noughties “buy a PC twice as fast at half the price two years later” kind of situation; AI accelerators are silicon like any other, and they already make use of the progress we made back then.