Running AI is so expensive that Amazon will probably charge you to use Alexa in the future, says outgoing exec

In an interview with Bloomberg, Dave Limp said that he “absolutely” believes that Amazon will soon start charging a subscription fee for Alexa.

  • arthurpizza · 125 points · 9 months ago

    We need to move AI from the cloud to our own hardware running in our homes. Free, open source, privacy focused hardware. It’ll eventually be very affordable.

    • LEX · 52 points · edited · 9 months ago

      That’s already here. Anyone can run AI chatbots similar to, but not as intelligent as, ChatGPT or Bard.

      Llama.cpp and koboldcpp allow anyone to run models locally, even with only a CPU if there’s no dedicated graphics card available (although more slowly). And there are numerous open source models available that can be trained for just about any task.

      Hell, you can even run llama.cpp on Android phones.

      This has all taken place in just the last year or so. In five to ten years, imo, AI will be everywhere and may even replace the need for mobile Internet connections in terms of looking up information.

      • @Zetta@mander.xyz · 8 points · edited · 9 months ago

        Yes, and you can run a language model like Pygmalion AI locally on koboldcpp and have a naughty AI chat as well. Or non-sexual roleplay.

        • LEX · 9 points · 9 months ago

          Absolutely and there are many, many models that have iterated on and surpassed Pygmalion as well as loads of uncensored models specifically tuned for erotic chat. Steamy role play is one of the driving forces behind the rapid development of the technology on lower powered, local machines.

            • LEX · 3 points · edited · 9 months ago

              Hugging Face is where the models live. Anything that’s uncensored (and preferably based on Llama 2) should work.

              Some popular suggestions at the moment might be HermesLimaRPL2 7B and MythomaxL2 13B for general roleplay that can easily include nsfw.

              There are lots of talented people releasing models every day, tuned to assist with coding, translation, roleplay, general assistance (like ChatGPT), writing, all kinds of things, really. Explore and try different models.

              General rule: if you don’t have a dedicated GPU, stick with 7B models. Otherwise, the bigger the better.
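              The rule of thumb above can be sketched as a tiny helper; the VRAM thresholds here are rough assumptions of mine, not from the thread:

```python
def suggest_model_size(has_gpu: bool, vram_gb: float = 0.0) -> str:
    """Illustrative rule of thumb: CPU-only -> 7B; with a GPU, pick the
    largest parameter count whose 4-bit weights fit in VRAM.
    The GB thresholds below are rough assumptions, not benchmarks."""
    if not has_gpu:
        return "7B"
    for size, needed_gb in [("70B", 40.0), ("13B", 8.0), ("7B", 0.0)]:
        if vram_gb >= needed_gb:
            return size
    return "7B"

print(suggest_model_size(False))        # 7B
print(suggest_model_size(True, 12.0))   # 13B
print(suggest_model_size(True, 48.0))   # 70B
```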

          • @Zetta@mander.xyz · 1 point · 9 months ago

            Which models do you think beat Pygmalion for erotic roleplay? Curious for research haha

            • LEX · 1 point · edited · 9 months ago

              Hey, I replied below to a different post with the same question, check it out.

                • LEX · 1 point · 9 months ago

                  lol nothing to be sorry about, I just wanted to make sure you saw it.

        • LEX · 2 points · 9 months ago

          Thanks for this, I haven’t tried GPT4All.

          Oobabooga is also very popular and relatively easy to run, but it’s not my first choice, personally.

        • LEX · 1 point · edited · 9 months ago

          13B quantized models, generally the most popular for home computers with dedicated GPUs, are between 6 and 10 gigs each. 7B models are between 3 and 6. So, no, not really?

          It is relative, so I guess if you’re comparing that to an Atari 2600 cartridge then, yeah, it’s hella huge. But you can store multiple models for the same storage cost as a single modern video game install.
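          Those sizes fall straight out of parameter count times bits per weight; a quick back-of-the-envelope sketch (the quantization widths are illustrative):

```python
def quantized_size_gb(n_params: float, bits_per_weight: float) -> float:
    """Approximate on-disk size of a quantized model:
    parameters * bits per weight / 8 bits per byte, in GB."""
    return n_params * bits_per_weight / 8 / 1e9

# A 13B model at common 4- to 6-bit quantizations: ~6.5 to ~9.8 GB
print(round(quantized_size_gb(13e9, 4), 1))  # 6.5
print(round(quantized_size_gb(13e9, 6), 1))  # 9.8
# A 7B model at 4-bit: ~3.5 GB
print(round(quantized_size_gb(7e9, 4), 1))   # 3.5
```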

          • @scarabic@lemmy.world · 1 point · 9 months ago

            Yeah that’s not a lot. I mean… the average consumer probably has 10GB free on their boot volume.

            It is a lot to download, though, if we’re talking about ordinary consumers. Not unheard of: some games on Steam are 50 GB+.

            So okay, storage is not prohibitive.

        • arthurpizza · 1 point · 9 months ago

          Storage is getting cheaper every day, and the models are getting smaller while holding the same amount of data.

      • @teuast@lemmy.ca · 1 point · 9 months ago

        “In five to ten years, imo, AI will be everywhere and may even replace the need for mobile Internet connections in terms of looking up information.”

        You’re probably right, but I kinda hope you’re wrong.

          • @teuast@lemmy.ca · 3 points · 9 months ago

            Call it paranoia if you want. Mainly I don’t have faith in our economic system to deploy the technology in a way that doesn’t eviscerate the working class.

            • LEX · 2 points · edited · 9 months ago

              Oh, you are 100% justified in that! It’s terrifying, actually.

              But what I am envisioning is using small, open source models installed on our phones that can answer questions or just keep us company. These would be completely private, controlled by the user only, and require no internet connection. We are already very close to this reality: local AI models can already run on Android phones, but the small AI “brains” that are best for phones are still pretty stupid (for now).

              Of course, living in our current Capitalist Hellscape, it’s hard not to imagine that going awry to the point where we’ll all ‘rent’ AI from some asshole who spies on everything we do, censors the AI for our own ‘protection’, or puts ads in there somehow. But I guess I’m a dreamer.

    • @pyldriver@lemmy.world · 23 points · 9 months ago

      God I wish, I would just love local voice control to turn my lights and such on and off… but noooooooooooo

        • @pyldriver@lemmy.world · 1 point · 9 months ago

          I have Home Assistant, but I have not heard anything good about Rhasspy. I just want to control lights and be able to use it to play music and set timers. That being said, I run Home Assistant right now and can control it with Alexa and Siri, but… I would like local only.

      • 🇰 🔵 🇱 🇦 🇳 🇦 🇰 ℹ️ · -1 points · edited · 9 months ago

        I have that with just my phone, using WiZ lights and IFTTT. It’s the only home automation I even have, because it’s the only one I found that doesn’t necessarily need a special base station like an Alexa or Google Home.

        • @AA5B@lemmy.world · 2 points · edited · 9 months ago

          But you want a local base station, else there’s no local control. You want to use local-only networks like Z-Wave, Zigbee, Thread, Bluetooth, etc., even though they require a base station, because that’s exactly what gives you a local-only way of controlling things.

          Matter promises that a base station may no longer be necessary for smart devices to control each other, but it is rolling out very slowly.

          I also wonder what I’ll be able to do with the Thread radio in the iPhone 15 Pro.

            • 🇰 🔵 🇱 🇦 🇳 🇦 🇰 ℹ️ · -1 points · edited · 9 months ago

            The base stations are what uses the cloud/AI shit. The setup I have doesn’t even require an Internet connection or wifi; it’s entirely bluetooth. Why in the hell would I want a base station that costs money, is controlled by Amazon or Google, and requires an Internet connection for my local shit?

            I don’t want a piece of hardware that does nothing but act like a fucking middleman for no good reason.

              • @foggenbooty@lemmy.world · 1 point · 9 months ago

              That is not necessarily true. Some base stations use the internet, yes, but not all. For example, a Philips Hue bridge does not require internet access, nor does Lutron Caseta. As the other person posted, Home Assistant is the absolute best (IMO) way to do everything locally without the internet.

              Your system, while it might work for you, does not scale well due to the limited range and reliability of Bluetooth. You’d likely be better off adopting a more robust protocol like Z-Wave or Zigbee and getting a hub that you have full control over.

    • a1studmuffin 🇦🇺 · 12 points · 9 months ago

      It’s the year of the voice for Home Assistant. Given their current trajectory, I’m hopeful they’ll have a pretty darn good replacement for the most common use cases of Google Home/Alexa/Siri in another year. Setting timers, shopping list management, music streaming, doorbell/intercom management. If you’re on the fence about a Nabu Casa subscription, pull the trigger as it helps them stay independent and not get bought out or destroyed by commercial interests.

      • @AA5B@lemmy.world · 4 points · 9 months ago

        Thumbs up for Nabu Casa and Home Assistant!

        I haven’t yet played with the local voice stuff but have been following it with interest. Actually, now that Raspberry Pis are starting to become available again, I’m on the fence between buying a few more vs. finding something with a little more power, specifically for voice processing.

        • @foggenbooty@lemmy.world · 1 point · 9 months ago

          Get something with a little more power. Pis are reaching prices where they no longer make sense. You can get an Intel N100 system on AliExpress/Amazon for pretty cheap now, and I’ve got mine running Proxmox hosting all kinds of stuff.

    • Captain Aggravated · 4 points · 9 months ago

      I do wonder how much of those voice assistants could run on-device. Most of what I use Bixby for (I know. I KNOW.) is setting timers. I think simple things like that could run entirely on the phone; it’s got a shocking amount of processing power in it.
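      A timer intent really is simple enough to run on-device; here’s a toy sketch of that kind of intent parsing (the phrasing pattern and units are my own illustration):

```python
import re

def parse_timer(utterance):
    """Toy on-device intent parser: turn 'set a timer for N <unit>s'
    into a duration in seconds, or None if the utterance doesn't match."""
    m = re.search(r"timer for (\d+) (second|minute|hour)s?\b", utterance.lower())
    if m is None:
        return None
    units = {"second": 1, "minute": 60, "hour": 3600}
    return int(m.group(1)) * units[m.group(2)]

print(parse_timer("Hey, set a timer for 5 minutes"))  # 300
print(parse_timer("set a timer for 2 hours"))         # 7200
print(parse_timer("play some music"))                 # None
```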

    • @AA5B@lemmy.world · 3 points · 9 months ago

      While you may have points against Apple and how effective Siri may be, with this latest generation of products even the Watch has enough processing power to do voice processing on device. No ads. No cloud services.

      • @whofearsthenight@lemm.ee · 1 point · 9 months ago

        Pretty much. If you want a voice assistant right now, Siri is probably the best in terms of privacy. I bought a bunch of Echos early, then they got a little shitty, but I was in, and now I just want them out of my house except for one thing: music. Spotify integration makes for easy multi-room audio in a way that doesn’t really work as well on the other platform I’ll consider (Apple/Siri) and basically adds Sonos-like functionality for a tiny fraction of the price. The Siri balls and AirPlay are just not as good and, of course, don’t work as well with Spotify.

        But Alexa is so fucking annoying that at this point I mostly just carry my phone (iPhone) and talk to that, even though it’s a little less convenient, because I’m really goddamned tired of hearing “by the way…”