2 points by joshgpurvis 7 hours ago | 3 comments
  • joshgpurvis 7 hours ago
    I got tired of repeating myself to every AI tool I use. My dietary restrictions, my tech stack, my family's names — every new agent starts from zero. So I built Epitome, an open-source personal database that any AI agent can read from and write to via MCP.

    It's five layers on top of Postgres: structured tables, a portable identity profile, semantic vector search (pgvector), a knowledge graph that auto-extracts entities and relationships, and a confidence-scored memory quality system that lets memories decay or get reinforced over time. Each user gets their own Postgres schema — not RLS, actual schema isolation.
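    The schema-per-user plus pgvector combination could be sketched like this (a minimal illustration with made-up names, not Epitome's actual DDL):

    ```sql
    -- Hypothetical per-user schema with a vector column for semantic search.
    CREATE EXTENSION IF NOT EXISTS vector;

    CREATE SCHEMA user_42;

    CREATE TABLE user_42.memories (
        id          bigserial PRIMARY KEY,
        content     text NOT NULL,
        embedding   vector(1536),               -- dimension depends on the embedding model
        confidence  real NOT NULL DEFAULT 0.5,  -- reinforced up, decayed down over time
        created_at  timestamptz NOT NULL DEFAULT now()
    );

    -- Semantic search never leaves the user's schema: cosine nearest neighbours.
    SELECT id, content
    FROM user_42.memories
    ORDER BY embedding <=> $1::vector
    LIMIT 5;
    ```

    A nice side effect of real schema isolation is that exporting or deleting one user is a plain pg_dump --schema=user_42 or DROP SCHEMA, rather than filtering every table.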

    Agents connect over Streamable HTTP with OAuth and granular consent controls. You decide which agent sees what. Append-only audit log for everything.
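    An append-only table in Postgres can be enforced at the database level with a trigger; a rough sketch (table and column names are illustrative):

    ```sql
    CREATE TABLE audit_log (
        id          bigserial PRIMARY KEY,
        agent_id    text NOT NULL,
        action      text NOT NULL,
        occurred_at timestamptz NOT NULL DEFAULT now()
    );

    -- Block UPDATE and DELETE so the log can only grow.
    CREATE FUNCTION forbid_mutation() RETURNS trigger AS $$
    BEGIN
        RAISE EXCEPTION 'audit_log is append-only';
    END;
    $$ LANGUAGE plpgsql;

    CREATE TRIGGER audit_log_append_only
    BEFORE UPDATE OR DELETE ON audit_log
    FOR EACH STATEMENT EXECUTE FUNCTION forbid_mutation();
    ```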

    MIT licensed, self-hostable with a single docker compose up, or use the hosted version. Built with Hono and React, with D3.js for the graph visualization.

    GitHub: https://github.com/gunning4it/epitome

    Would love feedback on the architecture and what's missing. Happy to go deep on any of the technical decisions or feel free to contribute.

  • joshgpurvis 6 hours ago
    It's a personal knowledge graph, so I'm expecting thousands of nodes, not billions.

    At that scale, PostgreSQL with recursive CTEs is more than fast enough, and it means one database does everything.

    Though I can imagine migrating to Neo4j if necessary.
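    For anyone curious, a recursive-CTE graph traversal might look roughly like this, assuming a generic edges(source_id, target_id) table (not Epitome's actual schema):

    ```sql
    -- Find everything reachable within 3 hops of a starting entity.
    WITH RECURSIVE reachable AS (
        SELECT target_id AS node, 1 AS depth
        FROM edges
        WHERE source_id = $1
      UNION
        SELECT e.target_id, r.depth + 1
        FROM edges e
        JOIN reachable r ON e.source_id = r.node
        WHERE r.depth < 3   -- bound the walk; cheap at thousands of nodes
    )
    SELECT DISTINCT node FROM reachable;
    ```

    UNION (rather than UNION ALL) deduplicates rows as the recursion proceeds, which also keeps cycles in the graph from looping forever within the depth bound.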

  • builderlore 6 hours ago
    Why PostgreSQL for the graph instead of a purpose-built graph database like Neo4j?