4 points by blazingbanana 2 hours ago | 2 comments
  • blazingbanana 2 hours ago
    Completely free, offline note and text formatting using local LLMs.

    I write a lot of notes that end up unstructured and hard to reuse or make sense of. So the goal was to build a fast, easy, local-first way to turn rough input into usable text without sending anything to the cloud.

    Desktop builds are available for macOS, Windows, and Linux, with optional Windows/Linux CUDA builds for faster inference. Everything runs locally.

    Big shout out to https://github.com/ggerganov/llama.cpp along with PyQt6, pdfplumber, pytesseract, and python-docx for making it easy to wire everything together.

    It ships with an 8 GB Phi-4 model, but there's no reason you couldn't point it at any other GGUF model you have.
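    For instance, outside the app you can test any GGUF directly with llama.cpp's CLI (the model path and prompt below are illustrative, not the app's defaults):

```shell
# Point llama.cpp at any GGUF model and ask it to clean up some notes.
# Flags: -m model path, -p prompt, -n max tokens, --temp sampling temperature.
./llama-cli -m ~/models/your-model.gguf \
    -p "Reformat these rough notes into a tidy outline:" \
    -n 512 --temp 0.2
```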

    Still very much a work in progress; any suggestions welcome.
