I have a proposal that cheaply addresses long-term memory for LLMs when new data arrives continuously. The system involves no code, just two Markdown files.
For retrieval, there is a semantic filesystem that makes it easy for LLMs to search using shell commands.
It is currently a scrappy v1, but it works better than anything I have tried.
Curious for any feedback!
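To make the retrieval idea concrete, here is a minimal sketch of how an LLM could search Markdown memory files with ordinary shell commands. The directory layout, file names, and tags are illustrative assumptions, not taken from the original post:

```shell
# Hypothetical memory store: one Markdown note per entry.
mkdir -p memory
cat > memory/2024-06-01-auth.md <<'EOF'
# Auth refactor
tags: auth, jwt
We moved token validation into middleware.
EOF
cat > memory/2024-06-02-db.md <<'EOF'
# DB migration
tags: postgres, migration
Schema v2 adds a users.last_seen column.
EOF

# Retrieval: list the notes that mention a keyword or tag,
# then the model reads only those files into context.
grep -l -i "jwt" memory/*.md
# prints memory/2024-06-01-auth.md
```

The point is that plain files plus `grep` keep the context window small: only matching notes are loaded, not the whole memory store.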

Discussion (10 Comments)
The hard part is usually knowing what *not* to write down. Every system I've seen eventually drowns in low-signal entries.
The problem is always that once there are too many memories, the context gets overloaded and the AI starts ignoring the system prompt.
Definitely not a solved problem, and there need to be benchmarks to evaluate these solutions. Benchmarks themselves can be easily gamed and aren't universally applicable.
I guess the Markdown approach really has an advantage over others.
PS: Something I built on Markdown: https://voiden.md/