2 points by ddalcu 2 hours ago | 1 comment
  • ddalcu 2 hours ago
    Native Zig server that runs MLX-format language models on Apple Silicon and exposes an OpenAI-compatible HTTP API. No Python.
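    Since the server speaks the OpenAI chat-completions wire format, any standard OpenAI client should work against it. A minimal sketch with the Python standard library; the port, path, and model id below are assumptions for illustration, not taken from the project:

    ```python
    import json
    import urllib.request

    # Hypothetical local endpoint; actual host/port depend on server config.
    URL = "http://localhost:8080/v1/chat/completions"

    # Example MLX-format model id (an assumption, not from the post).
    payload = {
        "model": "mlx-community/Llama-3.2-3B-Instruct-4bit",
        "messages": [{"role": "user", "content": "Hello"}],
    }

    def chat(url: str = URL) -> dict:
        """POST an OpenAI-style chat completion request to the local server."""
        req = urllib.request.Request(
            url,
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)
    ```

    An off-the-shelf OpenAI SDK should also work by pointing its base URL at the local server instead of api.openai.com.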

    Optional companion app: MLX Claw, a macOS menu bar app with built-in chat, agent mode, and model management.

    No dependencies, a 34 MB binary, and very low RAM usage compared to other LLM runners.