1 point by zoudong376 11 hours ago | 1 comment
  • brucehoult 4 hours ago
    > Instead of running large runtimes locally, it acts as a lightweight agent client and delegates reasoning to cloud LLM APIs (GLM/GPT/Claude), while keeping orchestration local.

    I thought that's what OpenClaw already is -- it can use a local LLM if you have one, but doesn't have to. If it's intrinsically heavy, that's only because it's JavaScript running in Node.js.
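
For reference, a minimal sketch of the pattern the quoted description implies: a local orchestration loop in Node.js that delegates each reasoning step to an OpenAI-compatible cloud endpoint and executes tools on the local machine. The endpoint URL, model name, environment variables, and the single shell tool below are illustrative placeholders, not OpenClaw's actual interface.

```typescript
// Sketch: local orchestration, cloud reasoning.
// Assumes Node 18+ (global fetch) and an OpenAI-compatible chat endpoint.
import { execSync } from "node:child_process";

const API_URL = process.env.LLM_API_URL ?? "https://api.openai.com/v1/chat/completions";
const MODEL = process.env.LLM_MODEL ?? "gpt-4o-mini"; // placeholder model name

type Message = { role: "system" | "user" | "assistant"; content: string };

// One round trip to the cloud LLM: only prompts and replies leave the machine.
async function complete(messages: Message[]): Promise<string> {
  const res = await fetch(API_URL, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.LLM_API_KEY}`,
    },
    body: JSON.stringify({ model: MODEL, messages }),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}

async function run(task: string) {
  // Conversation state and tool execution stay local; reasoning is delegated.
  const messages: Message[] = [
    {
      role: "system",
      content:
        "You are an agent. Reply with either SHELL: <command> to run a local command, or DONE: <answer>.",
    },
    { role: "user", content: task },
  ];

  for (let step = 0; step < 5; step++) {
    const reply = await complete(messages);
    messages.push({ role: "assistant", content: reply });

    if (reply.startsWith("DONE:")) {
      console.log(reply.slice(5).trim());
      return;
    }
    if (reply.startsWith("SHELL:")) {
      // Tool call runs locally; its output is fed back for the next step.
      const output = execSync(reply.slice(6).trim(), { encoding: "utf8" });
      messages.push({ role: "user", content: `Command output:\n${output}` });
    }
  }
}

run("How many files are in the current directory?");
```

The "heaviness" question in the comment is orthogonal to this structure: the loop itself is a few kilobytes of code whichever runtime hosts it; the weight comes from the runtime and dependency tree, not from where the model runs.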