Discussion (8 Comments)

flowerthoughts•9 minutes ago
For Claude Code, I feel 1M is enough. I've had a compaction once, but that was because I was forcing Claude to do something it clearly had a hard time understanding.

For general chat bots where the user doesn't understand what a context window is, what do you do about context? Latest few messages and then a memory tool? Compaction?
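One common answer to the question above combines both ideas: keep the latest few messages verbatim and compact everything older into a summary. A minimal sketch, where `summarize` is a hypothetical stand-in for an LLM summarization call:

```python
# Sketch of chat-context compaction: keep the most recent messages
# verbatim and fold older ones into a single running summary message.

def summarize(messages):
    # Placeholder: a real system would call a model here.
    return "Summary of %d earlier messages." % len(messages)

def compact(history, keep_recent=4):
    """Return a compacted history: one summary message plus the tail."""
    if len(history) <= keep_recent:
        return list(history)
    older, recent = history[:-keep_recent], history[-keep_recent:]
    summary = {"role": "system", "content": summarize(older)}
    return [summary] + list(recent)

history = [{"role": "user", "content": "msg %d" % i} for i in range(10)]
compacted = compact(history)
print(len(compacted))  # 5: one summary + 4 recent messages
```

A memory tool is the complementary piece: instead of one rolling summary, selected facts are written to a store and retrieved back into context on demand.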

brianush1•6 minutes ago
Claude does compaction in the regular web chat interface now, too
refibrillator•about 3 hours ago
Previous discussion here (with links to actual primary source):

https://news.ycombinator.com/item?id=48023079

No technical report published yet; given the VC funding, it's unlikely code or weights will be released either.

Chance-Device•about 2 hours ago
It’s probably something like DeepSeek’s native sparse attention with content-based granularity. They might not be publishing anything because it’s not such a strong value proposition, and doing so would invite commentary that would tank their investment opportunities.
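The "content-based granularity" idea can be illustrated with a toy block-sparse attention: each query scores key *blocks* with a cheap summary (mean-pooled keys) and attends densely only within its top-k blocks. This is a minimal sketch of the general technique, not DeepSeek's actual NSA implementation:

```python
import numpy as np

def block_sparse_attention(q, k, v, block=4, top_k=2):
    """Each query attends only to its top_k highest-scoring key blocks."""
    n, d = k.shape
    nb = n // block
    # Cheap per-block content summary: mean of the keys in each block.
    k_blocks = k[: nb * block].reshape(nb, block, d).mean(axis=1)
    out = np.zeros((q.shape[0], d))
    for i, qi in enumerate(q):
        # Rank blocks by similarity to this query; keep the top_k.
        chosen = np.argsort(k_blocks @ qi)[-top_k:]
        idx = np.concatenate(
            [np.arange(b * block, (b + 1) * block) for b in chosen]
        )
        # Dense softmax attention restricted to the selected blocks.
        logits = k[idx] @ qi
        att = np.exp(logits - logits.max())
        att /= att.sum()
        out[i] = att @ v[idx]
    return out

rng = np.random.default_rng(0)
q = rng.standard_normal((2, 8))
k = rng.standard_normal((16, 8))
v = rng.standard_normal((16, 8))
print(block_sparse_attention(q, k, v).shape)  # (2, 8)
```

The payoff is that attention cost scales with `top_k * block` per query rather than with the full sequence length, which is what makes very long contexts tractable.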
SilverElfin•about 1 hour ago
Or maybe because giving it away would tank their investment opportunities.
regularfry•about 1 hour ago
There's ways and means. Pushing something out in the sub-30B range would gain them mindshare and they could keep bigger models to themselves. I can't see any indication of what size their model is though.
roger_•about 3 hours ago
Have they published?
Bombthecat•about 3 hours ago
I'll believe it when I see it.