I have a custom learning system. We are all trying things; that's where AI development is right now.
None of us knows the best solution. We are all exploring different paths. I don't find memory and persistent long-term context to be an issue for me, but I'm using a fully custom AI Claude Code setup, so perhaps I've sorted it out for myself. Unsure.
Can you give a specific example? Like, talk through your workflow so I can understand it better?
The full context then looks something like: [intro prompt] + [old exchanges, lvl 1 summaries] + [larger system prompt] + [more recent exchanges, lvl 0 summaries] + [temporal context] + [recent messages with tool results stripped] + [recent messages including tool results]
Tool results are progressively stripped because they are generally only useful for a few turns. This lets us keep everything we've ever done in the context, and the model can easily look up more information by expanding each node. It's a single perpetual session that never compacts during active work.
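A minimal sketch of how that layered assembly might work. All names, layer boundaries, and sizes here are hypothetical illustrations, not the actual setup; a real implementation would call a model to produce the summaries rather than the placeholder used here.

```python
from dataclasses import dataclass, field

@dataclass
class Exchange:
    """One user/assistant turn; tool_results get stripped as it ages."""
    user: str
    assistant: str
    tool_results: list = field(default_factory=list)

def summarize(exchanges, level):
    # Placeholder: a real setup would call a model to summarize here.
    return f"[lvl {level} summary of {len(exchanges)} exchanges]"

def build_context(intro, system_prompt, temporal, history,
                  recent=4, with_tools=2):
    """Assemble the layered context (layer sizes are illustrative)."""
    old = history[:-recent * 2]          # oldest: coarse lvl-1 summaries
    mid = history[-recent * 2:-recent]   # middle: finer lvl-0 summaries
    new = history[-recent:]              # newest: kept verbatim
    parts = [intro]
    if old:
        parts.append(summarize(old, level=1))
    parts.append(system_prompt)
    if mid:
        parts.append(summarize(mid, level=0))
    parts.append(temporal)
    for ex in new[:-with_tools]:         # recent turns, tool results stripped
        parts.append(f"{ex.user}\n{ex.assistant}")
    for ex in new[-with_tools:]:         # newest turns, tool results intact
        parts.append(f"{ex.user}\n{ex.assistant}\n" + "\n".join(ex.tool_results))
    return "\n\n".join(parts)
```

As turns age they cross each boundary in order: first the tool results drop, then the full text collapses into a lvl 0 summary, then into a lvl 1 summary, so the context grows roughly logarithmically instead of linearly.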
I find it outperforms every other solution I've tried for my use case (a personal assistant).