2 points by kochc | 3 hours ago | 2 comments
  • kochc | 3 hours ago
    I use OpenCode, which already has GitHub Copilot, Ollama, Anthropic, Gemini, etc. configured. But every other tool — Open WebUI, LangChain, my own scripts — needed the same models re-entered separately. This plugin starts a local HTTP server on port 4010 that translates between the OpenAI, Anthropic, and Gemini API formats and whichever provider OpenCode has configured. So you point any tool at http://127.0.0.1:4010/v1 and it just works.
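    A minimal sketch of what "point any tool at the proxy" looks like from a client's side, using only the Python standard library. The request shape follows the standard OpenAI chat-completions format; the model id used here is a placeholder, not something the plugin guarantees — you'd use whatever model OpenCode actually has configured:

    ```python
    import json
    import urllib.request

    # The local proxy exposes an OpenAI-compatible endpoint on port 4010.
    PROXY_URL = "http://127.0.0.1:4010/v1/chat/completions"

    # Standard OpenAI chat-completions payload; "claude-sonnet" is a
    # hypothetical model id — substitute whichever model OpenCode serves.
    payload = {
        "model": "claude-sonnet",
        "messages": [{"role": "user", "content": "Hello"}],
    }

    req = urllib.request.Request(
        PROXY_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    # With OpenCode (and the plugin) running, sending it is one call:
    # resp = urllib.request.urlopen(req)
    ```

    The same idea applies to any OpenAI-style SDK or tool: set its base URL to http://127.0.0.1:4010/v1 and leave everything else unchanged.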
  • kochc | 3 hours ago
    Setup: npm install opencode-llm-proxy, add "plugin": ["opencode-llm-proxy"] to opencode.json, start OpenCode, done. Supports streaming for all four formats. 112 tests, MIT license.
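    Based on the setup step above, the opencode.json entry would look roughly like this (a sketch — any other keys you already have in the file stay alongside it):

    ```json
    {
      "plugin": ["opencode-llm-proxy"]
    }
    ```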