That is extra weird when you consider the audience: likely Vantage.sh users (who thus have the ability to create the read-only token mentioned elsewhere) who would almost certainly be using it from their workstation, in a commercial context. It sounds like you're trying to keep someone from selling your MCP toy and decided to be cute with the licensing text.
> Hey Dave, it is as it is stated. The MCP is published with dual licenses depending on your intent.
> https://fossa.com/blog/dual-licensing-models-explained/
After further discussion on the ticket the license is now just MIT.
For one, the MIT license does not prohibit selling. And wrapping it in a "for non-commercial uses" clause creates a contradiction that is difficult, if not impossible, to enforce.
That said, given https://github.com/runsecret/rsec#aws-secrets-manager, presumably in order to keep AWS credentials off disk one would then need something like this?
"vantage-mcp-server": {
"command": "/opt/homebrew/bin/aws-vault",
"args": [
"exec", "--region=us-east-1", "my-awesome-profile",
"--", "/opt/homebrew/bin/rsec", "run",
"--", "/opt/homebrew/bin/vantage-mcp-server"
],
"env": {"VANTAGE_BEARER_TOKEN": "rsec://012345678912/sm.aws/VantageBearerToken?region=us-east-1"}
}
in contrast to the op binary that is just one level of indirection, since they already handshake with the desktop app for $(op login) purposes

I agree RunSecret adds a level of indirection at this stage that op doesn't (if you are using 1Password). This is something I plan to polish up once more vaults are supported. You've given me some ideas on how to do that here.
And thanks for the advice on doing a Show HN, planning to do so once a few more rough edges are smoothed out.
That being said, an easier-to-distribute user experience would be to leverage short-lived OAuth tokens that LLM clients such as Claude or Goose ultimately manage for the user. We’re exploring these avenues as we develop the server.
If you really really really need to use static creds on your laptop, use aws-vault to export them, or ephemeral creds generated from them, into your environment.
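A minimal sketch of that approach, assuming aws-vault is installed and a profile named `my-awesome-profile` is configured in `~/.aws/config`:

```shell
# Confirm the profile works: aws-vault mints short-lived STS
# credentials from its encrypted backend, nothing lands on disk.
aws-vault exec my-awesome-profile -- aws sts get-caller-identity

# Run any command with the ephemeral credentials injected into
# its environment only for the lifetime of that process.
aws-vault exec my-awesome-profile -- env | grep '^AWS_'
```

The child process sees `AWS_ACCESS_KEY_ID` and friends, but they expire with the session and never appear in a plaintext credentials file.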
The biggest is giving the LLM context. On Vantage we have a primitive called a "Cost Report" that you can think of as being a set of filters. So you can create a cost report for a particular environment (production vs staging) or by service (front-end service vs back-end service). When you ask questions to the LLM, it will take the context into account versus just looking at all of the raw usage in your account.
Most of our customers will create these filters, define reports, and organize them into folders; the LLM takes that context into account, which can be helpful when asking questions.
Lastly, we support more providers beyond AWS, so you can also merge in associated costs from Datadog, Temporal, Clickhouse, etc.
Now we only have poor IAM UX to fall back on.
/s