Hey folks! Quick share for anyone interested in LLM infrastructure: an open-source project that might be useful.
Lynkr connects AI coding tools (like Claude Code) to multiple LLM providers with intelligent routing.
Key features:
- Route between multiple providers: Databricks, Azure AI Foundry, OpenRouter, Ollama, llama.cpp, OpenAI
- Cost optimization through hierarchical routing and aggressive prompt caching
- Production-ready: circuit breakers, load shedding, monitoring
- Supports the full Claude Code feature set (subagents, skills, MCP, plugins, etc.), unlike other proxies that only handle basic tool calling and chat completions
Great for:
:white_check_mark: Reducing API costs: hierarchical routing sends requests to smaller local models first and escalates to cloud LLMs automatically (see the sketch after this list)
:white_check_mark: Using enterprise infrastructure (Azure)
:white_check_mark: Local LLM experimentation
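To try the local-first tier, the sketch below just stands up Ollama as the cheap local backend. How Lynkr targets that endpoint, and when it escalates to a cloud provider, is configured per the repo's docs; the model name here is only an illustration.

```bash
# Stand up a local model for the cheap tier.
# Ollama listens on localhost:11434 by default; start the server first if it
# isn't already running as a background service.
ollama serve &         # skip if the Ollama service is already running
ollama pull llama3.2   # any small local model works; llama3.2 is just an example
```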
```bash
npm install -g lynkr
```
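A minimal sketch of wiring Claude Code to the proxy, with some guessed defaults: the `lynkr` start command and port 8080 below are assumptions, so check the README for the actual command and listen address. Claude Code reads `ANTHROPIC_BASE_URL` to route its requests through a gateway instead of hitting the Anthropic API directly.

```bash
# Start the proxy, then point Claude Code at it.
# NOTE: "lynkr" as the start command and port 8080 are placeholders;
# see the repo's README for the real command and default port.
lynkr &
export ANTHROPIC_BASE_URL="http://localhost:8080"
claude
```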
GitHub: https://github.com/Fast-Editor/Lynkr (Apache 2.0)
Would love to get your feedback on this one. If you find it helpful, a star on the repo is much appreciated!