1 point by sossoecho | 2 hours ago | 1 comment
  • sossoecho | 2 hours ago
    We've been building AI apps at GoReal-AI for two years. The #1 pain we kept hitting was prompt chaos: prompts hardcoded as strings everywhere, no versioning, a redeploy for every tweak, and zero collaboration between engineers and prompt writers.

    So we built PLP (Prompt Library Protocol) -- an open spec for managing prompts via REST. Three HTTP methods: GET, PUT, DELETE. That's the whole protocol.

    What's shipped:

    - Full protocol spec + OpenAPI schema
    - JavaScript SDK (@goreal-ai/plp-client)
    - Python SDK (plp-client)
    - Express middleware reference server

    We also built EchoStash (echostash.app) as a production implementation on top of PLP, but the protocol itself is fully open and MIT licensed. Anyone can implement a PLP-compliant server.

    Curious what HN thinks -- is prompt management a real problem you're hitting? What would you want from a protocol like this?

    Happy to answer questions about the design decisions.