cargo install flint-ai
flint use TheBloke/TinyLlama-1.1B-Chat-v1.0-GGUF
Runs on Apple Silicon via Metal, on NVIDIA GPUs via CUDA, on AMD GPUs via ROCm, or falls back to the CPU everywhere else. Models are stored in ~/.flint/models and persist across projects.
Still early (v0.1.0), but it works. I'd love feedback from anyone who tries it.