2 points by breezenik 16 hours ago | 1 comment
  • breezenik 16 hours ago
    OP here. We've been building agentic tools and realized we were all solving the same problem: passing context to the LLM. We standardized a folder structure (.ai/) and a "boot protocol" that lets any agent (Cursor, Claude, etc.) read project state from markdown files in the repo, effectively giving the repo its own memory.
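
    Since the comment doesn't show the spec itself, here is a purely hypothetical sketch of what a `.ai/` folder might contain — the file names and roles below are illustrative guesses, not taken from the released spec:

    ```
    .ai/
    ├── boot.md          # hypothetical: entry point an agent reads first (the "boot protocol")
    ├── architecture.md  # hypothetical: high-level system design notes
    ├── conventions.md   # hypothetical: coding standards and project patterns
    └── state.md         # hypothetical: current project state, acting as the repo's "memory"
    ```

    The idea would be that any agent pointed at the repo starts by reading the boot file, which tells it which of the other markdown files to load as context.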

    We also built a CLI (corepackai) to install these "Context Packs" like npm packages, and a Marketplace to share them (e.g. installing standard Next.js patterns or AWS architecture contexts).
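
    By analogy to npm, an install flow might look something like the following — these commands are a guess at the interface and may not match the actual corepackai CLI:

    ```
    # Hypothetical invocation (actual commands/flags may differ):
    corepackai install nextjs-patterns   # fetch a Context Pack from the Marketplace into .ai/
    corepackai list                      # show which packs the repo currently has installed
    ```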

    We just released the full spec and would love feedback on the architecture.