2 points by bcorp 5 hours ago | 1 comment
  • bcorp 5 hours ago
    I wanted to see how far back I could push an AI agent — not the LLM itself, but the client that talks to it, parses tool calls, and acts on the results.

    retro-agent is a terminal-based AI agent written in Zig 0.15. It connects to Ollama (or any OpenAI-compatible API) over local HTTP, supports function calling, and provides built-in tools for system diagnostics: processes, network, disk, services, memory, and arbitrary command execution.

    The target is Windows XP SP3 x86 on hardware as old as a Pentium III with 64 MB RAM. The binary is ~750 KB, single-threaded, no dependencies, no UCRT/MSVC runtime. It also cross-compiles to Linux x86/x64/ARM.

    Some things I had to deal with:

    - Win32 Console API for the TUI (CP437 box-drawing, colored output)
    - Automatic CP850 → UTF-8 conversion for localized Windows command output
    - UTF-8 → ASCII sanitization for console display
    - A compatibility shim for RtlGetSystemTimePrecise (which doesn't exist on XP)
    - Conversation history with a sliding window to stay within memory limits
    - A command whitelist and approval mode for security

    The LLM runs on a separate machine on the network — Ollama can't run on XP. The agent is just the thin client: parse, call tools, feed results back, loop until you get a text response (max 10 iterations).

    Tested with llama3, qwen2, mistral, and command-r. Any model with function-calling support works.

    MIT licensed. Would love feedback, especially from anyone still managing legacy Windows systems or working with Zig's cross-compilation story.

    https://github.com/benmaster82/retro-agent

    • fithisux 3 hours ago
      Is Zig able to target Windows XP?