1 point by Lbrant 5 hours ago | 1 comment
  • Lbrant 5 hours ago
    I wanted to run AI in my Rust app without sending data to OpenAI. Built Flint — drop it in and run any GGUF model locally on device.

    cargo install flint-ai
    flint use TheBloke/TinyLlama-1.1B-Chat-v1.0-GGUF

    Runs on Apple Silicon via Metal, NVIDIA via CUDA, AMD via ROCm, or any CPU as a fallback. Models are stored in ~/.flint/models and persist across projects.
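
    For anyone curious what the backend fallback order might look like, here's a minimal standalone Rust sketch. This is my own illustration, not Flint's actual code: the function name and the feature-flag names ("cuda", "rocm") are hypothetical, and only the fallback order (Metal, then CUDA, then ROCm, then CPU) comes from the description above.

    ```rust
    // Hypothetical sketch of picking a compute backend with CPU fallback.
    // Feature names are made up for illustration; Flint's real selection
    // logic is not shown in the post.
    fn pick_backend() -> &'static str {
        if cfg!(all(target_os = "macos", target_arch = "aarch64")) {
            "metal" // Apple Silicon
        } else if cfg!(feature = "cuda") {
            "cuda" // NVIDIA
        } else if cfg!(feature = "rocm") {
            "rocm" // AMD
        } else {
            "cpu" // always-available fallback
        }
    }

    fn main() {
        println!("selected backend: {}", pick_backend());
    }
    ```

    A compile-time check like this keeps the binary small (unused backends aren't linked in), at the cost of needing a rebuild to switch backends; a runtime probe would be the alternative trade-off.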

    Still early (v0.1.0) but works. Would love feedback from anyone who tries it.