35 points | by icarito | 6 days ago | 3 comments
  • guessmyname, 6 days ago
    It'd be better if it were written in C, or at least Vala. With Python, you have to wait a couple hundred milliseconds for the interpreter to start, which makes it feel less native than it could be. That said, the latency of the LLM responses is higher than the UI's, so I guess the slowness of Python doesn't matter.
    • icarito, 6 days ago
      Yeah, I agree; I've been thinking about using Rust. But ultimately it's also a GTK3-vs-GTK4 problem: if we could reuse the Python interpreter from the applet, that would speed things up, but GTK4 doesn't support AppIndicator icons(!).

      I've been pondering whether to backport to GTK3 for this sole purpose. I find that after the initial startup delay, the app's speed is okay...

      Porting to Rust isn't really planned because I'd lose the llm Python base, but it's still something that triggers my curiosity.

    • cma, 6 days ago
      What's the startup time now with a 9950X3D, after a prior start so the .pyc files are cached in RAM?
      • icarito, 5 days ago
        Hey, I felt bad that there was a long delay, and by making sure to lazy-load everything I could, I managed to bring the startup time down from 2.2 seconds to 0.6 on my machine! Massive improvement! Thanks for the challenge!
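The lazy-loading approach described above is a standard Python pattern: move heavy imports out of module top level and into the function that first needs them. A minimal sketch (the module and function names here are illustrative, not the app's actual code):

```python
# Deferred-import pattern: app startup only pays for what it
# immediately needs; heavy dependencies load on first use.

def open_chat_window():
    # Imported here instead of at module top level, so a cold start
    # skips this cost entirely until the chat window is opened.
    import json  # stands in for a heavy dependency (e.g. a GUI toolkit)
    return json.dumps({"window": "chat"})

if __name__ == "__main__":
    print(open_chat_window())
```

The trade-off is that the deferred cost is paid on first use of the feature instead of at startup, which usually feels snappier for a tray applet.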
        • cma, 4 days ago
          Nice, that's a huge difference.
      • cma, 5 days ago
        With a laptop 7735HS, using WSL2, I get 15 ms for the interpreter to start and exit without any imports.
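The measurement above can be reproduced with a few lines of stdlib Python: spawn a bare interpreter with no imports and time how long it takes to start and exit (timings will of course vary by machine and cache state):

```python
# Time a bare "python -c ''" start+exit, as in the comment above.
import subprocess
import sys
import time

start = time.perf_counter()
subprocess.run([sys.executable, "-c", ""], check=True)
elapsed = time.perf_counter() - start
print(f"bare interpreter start+exit: {elapsed * 1000:.1f} ms")
```

Running it a second time (with the interpreter's files warm in the OS cache) gives a fairer baseline for the cached-in-RAM case being discussed.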
        • icarito, 5 days ago
          I've got an i5-10210U CPU @ 1.60GHz.

          You triggered my curiosity. The chat window consistently takes 2.28 s to start; the Python interpreter takes roughly 30 ms. I'll be doing some profiling.
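For profiling a gap like that (2.28 s window vs. 30 ms interpreter), the stdlib's cProfile is a common starting point. A sketch, where the profiled function is a stand-in for the app's real entry point:

```python
# Profile a startup path with cProfile and print a cumulative-time
# summary; app_startup() is a placeholder for the real entry point.
import cProfile
import io
import pstats

def app_startup():
    import json  # placeholder for the app's heavy imports
    return json.dumps({"ready": True})

profiler = cProfile.Profile()
profiler.enable()
app_startup()
profiler.disable()

buf = io.StringIO()
pstats.Stats(profiler, stream=buf).sort_stats("cumulative").print_stats(5)
print(buf.getvalue())
```

For import cost specifically, running the app with `python -X importtime` prints per-module import timings to stderr, which is often where most of a GUI app's startup time hides.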

      • icarito, 6 days ago
        I wonder! On my more modest setup it takes perhaps a couple of seconds. After that it's quite usable.
  • Gracana, 6 days ago
    This looks quite nice. I would like to see the system prompt and inference parameters exposed in the UI, because those are things I'm used to fiddling with in other UIs. Is that something that the llm library supports?
    • icarito, 6 days ago
      Yeah, absolutely. I've just got to the point where I'm happy with the architecture, so I'll continue adding UI. I've just added support for fragments, and I've thought of presenting them as if they were attached documents. On the radar: switching models mid-conversation, and perhaps the ability to roll back a conversation or remove some messages. But yes, the system prompt and parameters would be nice to expose too! Thanks for the suggestions!
      • Gracana, 6 days ago
        Awesome. It would be great to see a nice GTK-based open-source competitor to LM Studio and the like.
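On whether the llm library supports this: its Python API does accept a system prompt and sampling options as keyword arguments to `model.prompt()`. A sketch of wiring UI settings through to it; the helper function, model name, and option values here are assumptions for illustration, not the app's actual code:

```python
# Collect UI-exposed settings into keyword arguments for the llm
# library's model.prompt() call (helper is hypothetical).

def settings_to_prompt_kwargs(system_prompt, temperature):
    """Build prompt() kwargs from UI settings, skipping unset values."""
    kwargs = {}
    if system_prompt:
        kwargs["system"] = system_prompt
    if temperature is not None:
        kwargs["temperature"] = temperature
    return kwargs

kwargs = settings_to_prompt_kwargs("You are concise.", 0.7)
print(kwargs)

# With llm installed (and an API key configured), this would be used as:
#   import llm
#   model = llm.get_model("gpt-4o-mini")   # model name is an assumption
#   response = model.prompt("Hello", **kwargs)
```

Which option names a given model accepts (temperature, top_p, etc.) depends on the llm plugin backing it, so a UI would ideally read the model's available options rather than hard-coding them.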
  • indigodaddy, 6 days ago
    Does this work on Mac, or Linux only?
    • icarito, 6 days ago
      I'd truly like to know, but I've no access to a Mac to try. If you can, try it and let me know? If it works, please send a screenshot!