5 points by gzuuus 5 hours ago | 2 comments
  • quinncom 4 hours ago
    I just learned yesterday that ChatGPT (and maybe others) can’t connect to an MCP server running on localhost; it needs an endpoint on the public internet. (I guess because the request comes from OpenAI servers?)

    I’d rather not expose a private MCP to the public, so ContextVM sounds like a step in the right direction. But I’m confused about how it is called: don’t OpenAI’s servers still need you to provide a public endpoint with a domain name and TLS? Or does it use a Nostr API?

    • gzuuus 3 hours ago
      Interesting, I didn't know about that. It could be for security reasons or to lock users into their platform tools, but it seems odd.

      If your client can still connect to a stdio MCP server, you can plug in a remote MCP server exposed through ContextVM. You can do this with the CVMI CLI tool, or, if you need custom features, the SDK provides the primitives to build a proxy. For example, using CVMI you can run an existing stdio server with `npx cvmi serve -- <your-command-to-run-the-server>`, or a remote HTTP server with `npx cvmi serve -- http(s)://myserver.com/mcp`. Either way, your server becomes available through Nostr, and the server's public key is printed in your terminal.

      Locally, you can then use the command `npx cvmi use <server-public-key>` to configure it as a local stdio server. The CLI binds both transports, Nostr <-> stdio, so your remote server will appear as a local stdio server. I hope this clarifies your question. For more details, see the documentation at https://docs.contextvm.org. Please ask if you have any other questions :)
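      To recap, the flow from the two commands above looks like this (placeholders are kept as in the thread; substitute your own server command and the printed public key):

      ```shell
      # On the host machine: expose an existing stdio MCP server over Nostr.
      npx cvmi serve -- <your-command-to-run-the-server>

      # Or expose an existing remote HTTP MCP server instead:
      npx cvmi serve -- http(s)://myserver.com/mcp

      # serve prints the server's public key. On the client machine, bind
      # Nostr <-> stdio so the remote server appears as a local stdio server:
      npx cvmi use <server-public-key>
      ```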

  • aaaljaz 5 hours ago
    this looks great, will need to give it a go!

    how do you see the skills vs mcp playing out in relation to this tho?

    • gzuuus 4 hours ago
      Hey! Thanks :)

      From my perspective, Skills and MCP complement each other: skills orchestrate tasks, while MCP implements them. This synergy has been explored in several articles and in the recent MCP "code mode" trend. MCP serves as a standard for capability execution, while skills help with progressive disclosure and token savings. I've also seen a trend of exposing skills as MCP resources using the `skill://` prefix.

      Different working groups are leveraging MCP as a common standard to save tokens, for example by using an MCP server that exposes just `search` and `execute` tools.

      On the CVM side, we're developing CVMI, a CLI tool for installing CVM-related skills and for serving or using servers as regular MCPs. Soon, CVMI will also enable calling CVM servers directly, allowing you to create scripts using just bash and CVMI.