• @FrostyTrichs@lemmy.world · 67 points · 1 year ago

    Why the hell can’t we just have both? One of the biggest problems with smart speakers and voice assistants is that they’re so damn stupid so often. If A.I. were to become smart enough to be what the current assistants/speakers aren’t, surely that would drive device sales and engagement astronomically higher, right?

    • @phoneymouse@lemmy.world · 32 points · 1 year ago

      That would be the goal. The tricky part is matching intents that align with some API integration to whatever psychobabble the LLM spits out.

      In other words, the LLM is just predicting the next word, but how do you know when to take an action like turning on the lights, ordering a pizza, setting a timer, etc.? The way that was done with Alexa needs to be adapted to fit the way LLMs work.
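
      For illustration only (these names are invented, not Alexa’s real schema), the “action” side is usually a small fixed table of intents, each wired to one concrete API call; the hard part is getting from free-form model text to exactly one of these entries, or to none at all:

      ```python
      from typing import Callable

      # Hypothetical smart-home actions -- each one wraps a real API integration.
      def turn_on_lights(room: str) -> None:
          print(f"Lights on in {room}")

      def order_pizza(size: str) -> None:
          print(f"Ordering a {size} pizza")

      def set_timer(minutes: int) -> None:
          print(f"Timer set for {minutes} minutes")

      INTENTS: dict[str, Callable] = {
          "turn_on_lights": turn_on_lights,
          "order_pizza": order_pizza,
          "set_timer": set_timer,
      }

      # The assistant's job: turn "hey, put the kitchen lights on" into
      # INTENTS["turn_on_lights"](room="kitchen") -- and do nothing at all
      # when the user is just chatting.
      ```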

      • Deceptichum · 7 points · 1 year ago

        Eh, just ask the LLM to format requests in a way that can be parsed into a function call.

        It’s pretty trivial to get an LLM to do that.
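
        A minimal sketch of that idea (the prompt and parsing are invented for illustration, reusing the hypothetical intent table sketched above): tell the model to answer only in a machine-readable format, then dispatch whatever parses cleanly and treat everything else as ordinary chat.

        ```python
        import json

        SYSTEM_PROMPT = (
            'Answer ONLY with JSON like {"intent": "set_timer", "args": {"minutes": 10}}. '
            'Valid intents: turn_on_lights, order_pizza, set_timer, or none.'
        )

        def dispatch(llm_reply: str) -> bool:
            """Run the matching handler if the reply parses as a command; otherwise it's chat."""
            try:
                msg = json.loads(llm_reply)
            except json.JSONDecodeError:
                return False  # not a command -> fall back to a normal spoken answer
            if not isinstance(msg, dict):
                return False
            handler = INTENTS.get(msg.get("intent"))
            if handler is None:
                return False
            handler(**msg.get("args", {}))
            return True
        ```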

        • PupBiru · 7 points · 1 year ago

          in fact it’s literally the basis for the “tools” functionality in the new openai/chatgpt stuff!

          that “browse the web”, “execute code”, etc. is all the LLM formatting its output in a specific way
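
          roughly what that looks like with the OpenAI Chat Completions API (just a sketch: the set_timer tool here is made up, and field names may differ between SDK versions):

          ```python
          from openai import OpenAI

          client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

          tools = [{
              "type": "function",
              "function": {
                  "name": "set_timer",  # illustrative tool, not a real built-in
                  "description": "Set a kitchen timer",
                  "parameters": {
                      "type": "object",
                      "properties": {"minutes": {"type": "integer"}},
                      "required": ["minutes"],
                  },
              },
          }]

          response = client.chat.completions.create(
              model="gpt-4o-mini",
              messages=[{"role": "user", "content": "set a timer for ten minutes"}],
              tools=tools,
          )

          # if the model decided to call the tool, the structured call is right there
          for call in response.choices[0].message.tool_calls or []:
              print(call.function.name, call.function.arguments)  # set_timer {"minutes": 10}
          ```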

      • @floofloof@lemmy.ca · 5 points · 1 year ago

        Microsoft seems to be attempting this with the new Copilot in Windows. You can ask it to open applications, etc., and also chat with it. But it is still pretty clunky when it comes to the assistant part (e.g., I asked it to open my power settings, and after a bit of to and fro it managed to open the Settings app, after which I had to find the power settings myself). And they’re planning to charge for it, starting at an outrageous $30 per month. I just don’t see that it’s worth that to the average user.

      • @ExLisper@linux.community · 3 points · 1 year ago

        It’s actually fairly easy. "I’m a computer. From now on, only communicate with me in valid JSON in the format {“command”: “name”, “parameters”: []}. Possible commands are “toggle_lights”, “pizza”, “set_timer”." And so on. Current models are remarkably good at responding with valid JSON; I didn’t have any issues with that. They will still hallucinate details (like what would it do if you tried to set a timer for pizza?), but I’m sure you can train the models to address those issues.

        I was thinking about building an OpenAI/Google Assistant bridge myself for Spotify. Like “Play me that Michael Jackson song with that video clip with monsters”. The current assistant can’t handle that, but you can just ask ChatGPT for the name of the song and then pass it to the assistant. This is what they’d have to do, but on a bigger scale.
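
        A sketch of that Spotify bridge idea (the send_to_assistant stub is hypothetical and stands in for however you would hand text to the existing assistant): use the model only to resolve the fuzzy description into an exact title, then let the assistant do what it is already good at.

        ```python
        from openai import OpenAI

        client = OpenAI()

        def send_to_assistant(utterance: str) -> None:
            print("->", utterance)  # stand-in for the real hand-off to the voice assistant

        def resolve_song(description: str) -> str:
            """Ask the model to turn a fuzzy description into an exact 'Artist - Title'."""
            reply = client.chat.completions.create(
                model="gpt-4o-mini",
                messages=[
                    {"role": "system", "content": "Reply with only 'Artist - Title', nothing else."},
                    {"role": "user", "content": description},
                ],
            )
            return reply.choices[0].message.content.strip()

        def play(description: str) -> None:
            title = resolve_song(description)
            send_to_assistant(f"play {title} on Spotify")

        # play("that Michael Jackson song with the video clip with monsters")
        # -> resolve_song would most likely return "Michael Jackson - Thriller"
        ```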

    • @asdfasdfasdf@lemmy.world · 14 points · 1 year ago

      I just tried the new OpenAI voice conversation feature and thought about this too. It’s everything I had hoped and dreamed that voice assistants would be when they first came out. It’s really surprising that the ones from huge tech companies suck so much.

      • paraphrand · 5 points · 1 year ago

        The tech to make them as good as what you just tried only came about more recently.

        Voice assistants, particularly Siri, are structured in a VERY different way.

    • KᑌᔕᕼIᗩ · 12 points · 1 year ago

      Because the elephant in the room is that AI isn’t actually AI; it’s a huge database of internet and creative content combined with a language-processing tool that takes its best guess at how to respond to you with that information.

    • @yesman@lemmy.world · 9 points · 1 year ago

      We can’t have both because Alexa’s job is not to give customers a good experience; it’s to make them comfortable re-ordering Tide Pods with their voice.

      Even households with Prime and an Echo in every room don’t trust that bitch with their credit card. Making her smart won’t fix that; she’s a failure.