5 points by nvader 3 days ago | 1 comment
  • billylo 2 days ago
    Windows and macOS do come with a small model for generating text completions. You can write a wrapper for your own TUI that accesses them in a platform-agnostic way.
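    A minimal sketch of what such a wrapper could look like. The class and backend names here are hypothetical, and the backends are stubs; a real implementation would call whatever on-device model the OS exposes.

```python
# Hypothetical platform-agnostic wrapper; the backend classes and
# pick_backend() helper are illustrative names, not a real library.
import sys
from abc import ABC, abstractmethod

class CompletionBackend(ABC):
    """Common interface the TUI codes against."""
    @abstractmethod
    def complete(self, prompt: str) -> str:
        ...

class StubBackend(CompletionBackend):
    """Placeholder; a real backend would invoke the OS-provided model."""
    def __init__(self, name: str):
        self.name = name

    def complete(self, prompt: str) -> str:
        # Stub behaviour: tag the prompt with the backend name.
        return f"[{self.name}] {prompt}"

def pick_backend() -> CompletionBackend:
    # Route by host OS; both branches return stubs in this sketch.
    if sys.platform == "darwin":
        return StubBackend("macos-builtin-model")
    if sys.platform.startswith("win"):
        return StubBackend("windows-builtin-model")
    return StubBackend("fallback")
```

    The TUI only ever sees `CompletionBackend`, so swapping in an ollama-backed implementation later is a one-line change.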

    For consistent LLM behaviour, you can use the ollama API with your model of choice to generate: https://docs.ollama.com/api/generate
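    A small sketch against that endpoint, assuming ollama is running locally on its default port (11434); the model name is just an example and must already be pulled.

```python
import json
import urllib.request

# ollama's default local endpoint for the generate API
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    # stream=False returns one JSON object instead of newline-delimited chunks
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    data = json.dumps(build_payload(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        # The non-streaming response carries the full completion in "response"
        return json.loads(resp.read())["response"]
```

    Usage would be something like `generate("llama3.2", "Complete this sentence: ...")`.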

    Chrome has a built-in Gemini Nano too, but there isn't an official way to use it outside Chrome yet.

    • nvader2 days ago
      Is there a Linux-y standard brewing?
      • billylo a day ago
        Each distro is doing its own thing. If you are targeting Linux mainly, I would suggest building on top of ollama or LiteLLM.
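        For the LiteLLM route, a sketch of how that could look. The model string is an example; the `ollama/` prefix routes the call to a local ollama server, and `litellm` is imported lazily here so the message helper stands alone.

```python
def as_messages(prompt: str) -> list:
    # LiteLLM takes OpenAI-style chat messages
    return [{"role": "user", "content": prompt}]

def ask(model: str, prompt: str) -> str:
    # pip install litellm; imported lazily inside the function in this sketch
    import litellm
    resp = litellm.completion(model=model, messages=as_messages(prompt))
    return resp.choices[0].message.content

# Example (assumes a local ollama server with the model pulled):
# ask("ollama/llama3.2", "Say hello")
```

        Because LiteLLM normalizes many providers behind one `completion()` call, the same code can later point at a hosted model by changing only the model string.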