• amzd
    110 months ago

    If you have a GPU in your PC, it’s almost always faster to just run your own LLM locally, and you won’t have this issue. Search for Ollama.
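
    For example, once Ollama is installed and running, it serves a local HTTP API (port 11434 by default), so you can query it from a few lines of Python. A minimal sketch, assuming you’ve already pulled a model (here “llama3” — swap in whichever one you downloaded):

    ```python
    # Minimal sketch: query a locally running Ollama server.
    # Assumes Ollama is installed and running, and that the "llama3"
    # model (an assumption -- use whatever you pulled) is available.
    import json
    import urllib.request

    def ask_local_llm(prompt: str, model: str = "llama3") -> str:
        # Ollama's generate endpoint; default port is 11434.
        req = urllib.request.Request(
            "http://localhost:11434/api/generate",
            data=json.dumps(
                {"model": model, "prompt": prompt, "stream": False}
            ).encode(),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["response"]

    print(ask_local_llm("Why is the sky blue?"))
    ```

    Everything stays on your machine, so there’s no remote service to go down or rate-limit you.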