7 points by manveerc 3 hours ago | 1 comment
  • briefrrapp 3 hours ago
    You mentioned using ClickHouse to store 'soft context' like Slack threads and postmortems alongside hard telemetry. Are you suggesting storing these as vectorized embeddings directly in ClickHouse using their Vector Search capabilities, or keeping them as raw text and letting the LLM parse them via SQL?
    • manveerc 3 hours ago
      That's a good question. I'd recommend MCP for the bulk of the 'chatty' soft data, to keep the database clean. However, you should selectively ingest high-value data into ClickHouse for vector search.

      For example, you wouldn't ingest every 'good morning' message. But once an incident is resolved, you could ETL the relevant threads (filtering out noise) and the resulting RCA into ClickHouse as a vectorized document. That way, the copilot can recall the solution six months later without depending on Slack.
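      A minimal sketch of that post-incident ETL step, assuming Slack-style message dicts; the noise filter, field names, and the `NOISE` set are illustrative, and the embedding step is left out (any model would do). The DDL string uses ClickHouse's real `Array(Float32)` column type, which you'd later query with `cosineDistance`:

```python
# Sketch: filter a resolved incident's Slack thread down to the
# substantive messages, then fold in the RCA as one document ready
# for embedding + ClickHouse ingestion. All names are illustrative.

NOISE = {"good morning", "thanks", "+1", "ack"}  # assumed noise phrases

def filter_noise(messages):
    """Keep only substantive messages: drop greetings and one-liners."""
    return [
        m for m in messages
        if m["text"].strip().lower() not in NOISE and len(m["text"]) > 20
    ]

def build_incident_doc(thread, rca):
    """Concatenate the filtered thread with the RCA into one document."""
    body = "\n".join(f'{m["user"]}: {m["text"]}' for m in filter_noise(thread))
    return f"{body}\n\nRCA:\n{rca}"

# Target table (valid ClickHouse DDL) -- the embedding column is an
# Array(Float32), recalled later via cosineDistance(embedding, query_vec).
DDL = """
CREATE TABLE incident_docs (
    incident_id String,
    resolved_at DateTime,
    doc         String,
    embedding   Array(Float32)
) ENGINE = MergeTree ORDER BY resolved_at
"""

thread = [
    {"user": "alice", "text": "good morning"},
    {"user": "bob",   "text": "p99 latency spiked after the 14:02 deploy of checkout"},
    {"user": "alice", "text": "+1"},
    {"user": "bob",   "text": "rolling back; the pool was exhausted by a retry loop"},
]
doc = build_incident_doc(
    thread, rca="Retry storm exhausted the DB pool; fix: add jitter and cap retries."
)
print(doc)
```

      The 'good morning' and '+1' messages are dropped before anything is embedded, so only the diagnosis and the RCA land in `incident_docs`.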