1 point by halilhp 2 hours ago | 2 comments
  • halilhp 2 hours ago
    Every new Claude Code session starts from zero. Your AI doesn't remember yesterday's 3-hour debugging session, the architecture decisions from last week, or the approaches that already failed.

      MemoTrail is an MCP server that gives your AI coding assistant persistent
      memory. It automatically indexes every past conversation, embeds them locally
      using sentence-transformers, and makes everything semantically searchable.
    
      Setup is two commands:
    
        pip install memotrail
        claude mcp add memotrail -- memotrail serve
    
      After that, you can ask things like "Why did we choose Redis?" and the AI
      will find the relevant context from any past session — even months ago.
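A toy sketch of how a question like that can be matched against past chunks. The real tool uses all-MiniLM-L6-v2 embeddings; here a bag-of-words vector stands in so the example runs with no ML dependencies, and the chunk texts are invented for illustration.

```python
# Toy illustration of retrieval over past conversation chunks.
# Stand-in: word-count vectors + cosine similarity (MemoTrail itself
# uses sentence-transformers embeddings, not this).
import math
from collections import Counter

def embed(text):
    """Stand-in embedding: a word-count vector over lowercased tokens."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Invented example chunks from past sessions.
chunks = [
    "we picked redis for session caching because of ttl support",
    "the frontend build broke after upgrading node",
]

query = "why did we choose redis"
best = max(chunks, key=lambda c: cosine(embed(query), embed(c)))
print(best)  # the Redis-related chunk ranks highest
```

Real embeddings capture paraphrases ("choose" vs. "picked") that word overlap misses, which is the point of using a sentence-transformer model instead.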
    
      How it works:
      - Reads Claude Code session logs from ~/.claude/
      - Chunks conversations into meaningful segments
      - Embeds with all-MiniLM-L6-v2 (~80MB, CPU only)
      - Stores vectors in ChromaDB, metadata in SQLite
      - Exposes 6 MCP tools (search_chats, get_decisions, save_memory, etc.)
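The read-chunk-store steps above can be sketched roughly as follows. The log format, chunking rule, and schema here are illustrative assumptions, and the embedding/ChromaDB stage is omitted so the sketch stays dependency-free; only the SQLite metadata side is shown.

```python
# Minimal sketch of the indexing pipeline: parse a session log,
# chunk messages, store chunk metadata in SQLite. The JSONL message
# shape and fixed-size chunking rule are assumptions for illustration.
import json
import sqlite3

# A toy session log: one JSON object per message (assumed structure).
raw_log = "\n".join(json.dumps(m) for m in [
    {"role": "user", "content": "Should we cache sessions in Redis?"},
    {"role": "assistant", "content": "Redis fits: fast, with TTL support."},
    {"role": "user", "content": "OK, let's go with Redis."},
])

def chunk_session(log_text, max_messages=2):
    """Group consecutive messages into fixed-size chunks (toy rule)."""
    messages = [json.loads(line) for line in log_text.splitlines()]
    return [
        " ".join(f'{m["role"]}: {m["content"]}'
                 for m in messages[i:i + max_messages])
        for i in range(0, len(messages), max_messages)
    ]

chunks = chunk_session(raw_log)

# Metadata store: one row per chunk, keyed by the same id the
# corresponding vector would get in the vector store.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE chunks (id TEXT PRIMARY KEY, session TEXT, text TEXT)")
db.executemany(
    "INSERT INTO chunks VALUES (?, ?, ?)",
    [(f"sess1-{i}", "sess1", c) for i, c in enumerate(chunks)],
)

for chunk_id, text in db.execute("SELECT id, text FROM chunks"):
    print(chunk_id, "->", text[:50])
```

In the real pipeline each chunk would additionally be passed through the embedding model and the vector written to ChromaDB under the same id, so a semantic hit can be joined back to its session metadata.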
    
      Everything runs locally — no cloud, no API keys, no data leaves your machine.
      MIT licensed.
    
      Currently supports Claude Code. Cursor and Copilot collectors are on the roadmap.
    
      I built this because I kept losing context between sessions. Would love feedback
      on the approach and what features would be most useful.
  • halilhp 2 hours ago
    Hi HN, I'm the author. I built MemoTrail because I was frustrated with losing context between Claude Code sessions. Every new session starts from scratch — the AI has no idea what you discussed yesterday.

      MemoTrail runs as an MCP server, automatically indexes your past sessions,
      and makes them semantically searchable. It's completely local — no cloud,
      no API keys.
    
      Currently it supports Claude Code only, but Cursor and Copilot collectors
      are planned. Would love to hear what features you'd find most useful.
    
      Happy to answer any questions!