
Discussion (4 Comments)

great_psy · about 1 hour ago
LLM memory (in general, any implementation) is good in theory.

In practice, as it grows it gets just as messy as not having it.

In the example on your front page you say "continue working on my project", but you're rarely working on just one project; you might have 5 or 10 in memory, each of which made sense to store at the time.

So now you still have to say "continue working on the SaaS project". Sure, there's some context around the details, but you pay for it by filling up your LLM context and making extra MCP calls.
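The trade-off described above can be sketched concretely. The snippet below builds the JSON-RPC `tools/call` request shape that MCP uses for tool invocations and roughly estimates the context cost of the retrieved memories; the tool name `memory_search` and the ~4-characters-per-token heuristic are illustrative assumptions, not any specific product's API.

```python
import json

def build_memory_lookup(project: str, request_id: int = 1) -> dict:
    """Build a JSON-RPC 2.0 tools/call request, the envelope MCP uses.

    Each disambiguating lookup ("which project?") is one extra round trip
    to the MCP server before the model can continue.
    """
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {
            # "memory_search" is a hypothetical tool name for illustration.
            "name": "memory_search",
            "arguments": {"query": f"project:{project}"},
        },
    }

def context_cost(memories: list[str]) -> int:
    """Rough token estimate (~4 chars per token) for memories that get
    stuffed back into the model's context window after retrieval."""
    return sum(len(m) for m in memories) // 4

# One extra MCP call per disambiguation, plus tokens for whatever comes back.
request = build_memory_lookup("saas")
memories = [
    "SaaS project uses Postgres 16 and a FastAPI backend.",
    "Auth flow was migrated to OAuth device codes last week.",
]
print(json.dumps(request))
print("approx. tokens added to context:", context_cost(memories))
```

This is the cost the commenter is pointing at: even when retrieval works, every stored project adds both a lookup round trip and a slice of the context budget.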

dennisy · 13 minutes ago
True! But this is a very naive implementation; a proper implementation could overcome these challenges.
dennisy · about 1 hour ago
Congratulations on the launch!

There is a lot of competition in this space; how is your tool different?

alash3al · about 7 hours ago
Platform memory is locked to one model and one company. Stash brings the same capability to any agent — local, cloud, or custom. MCP server, 28 tools, background consolidation, Apache 2.0.