Discussion (5 Comments)
Otherwise, if you have a GPU with more than about 4GB of VRAM, there are better models: Gemma4 and Qwen3.6 (or Qwen3.5 if you need the smaller dense models that haven't yet been released for 3.6) are a good place to start.