
Discussion (8 Comments) · Read Original on HackerNews

kilroy123 • about 7 hours ago
Both are down for me. :-/ I'm currently in Eastern Europe.
AustinDev • about 7 hours ago
Both are currently working in the US.
lrvick • about 6 hours ago
Burn baby burn.

Meanwhile, you can always buy hardware like a Strix Halo and have local LLMs that no third party can take away from you.
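The outage-proofing idea above can be sketched as a simple fallback: try the hosted API first, and drop to a local OpenAI-compatible server (both llama.cpp and Ollama expose one on localhost) when the hosted endpoint is unreachable. This is a purely hypothetical sketch — the hosted URL is made up, and the probe results in the example are stubbed rather than taken from a live network check.

```python
# Hypothetical sketch: fall back from a hosted LLM API to a local
# OpenAI-compatible server when the hosted endpoint is down.
# The hosted URL below is invented; 11434 is Ollama's default port.
from urllib.request import urlopen
from urllib.error import URLError

ENDPOINTS = [
    "https://api.example-hosted.com/v1",  # hypothetical hosted API
    "http://localhost:11434/v1",          # local Ollama server
]

def is_up(base_url: str, timeout: float = 2.0) -> bool:
    """Probe an endpoint; treat any connection error as 'down'."""
    try:
        urlopen(base_url + "/models", timeout=timeout)
        return True
    except (URLError, OSError):
        return False

def pick_endpoint(endpoints, probe=is_up):
    """Return the first reachable endpoint, or None if all are down."""
    for url in endpoints:
        if probe(url):
            return url
    return None

# Example with a stubbed probe: hosted is down, local is up.
status = {
    "https://api.example-hosted.com/v1": False,
    "http://localhost:11434/v1": True,
}
chosen = pick_endpoint(ENDPOINTS, probe=lambda u: status[u])
print(chosen)  # http://localhost:11434/v1
```

In real use you would pass the chosen base URL to whatever OpenAI-compatible client you already have; the point of the sketch is only that the selection logic stays trivial once both backends speak the same API shape.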

virgildotcodes • about 5 hours ago
I really wish local models could compete with Codex, but they are miles apart for now. I'm not sure how that will ever change, unless local models at some point in the future catch up to the current state of 5.4 high.

Even then, the frontier models would likely have improved by an equivalent degree, so you'd again be faced with the same choice of deciding between a dramatically less effective local tool and a far more capable, closed remote model.

I guess there's going to be some point of "good enough" for most people.

I feel like the closed frontier models really got there around 8 months ago, and then even more so ~4-6 months ago with the release of the Codex series and then Opus 4.6. It finally feels like you can get reliably good implementations of features that follow repo patterns and best practices, and, at least with 5.4 High/Xhigh Codex, code reviews that don't mostly surface hallucinated or superficial bullshit.

While I'm rambling, I feel like when/if local models ever do catch up to this point, the frontier models are going to be so damn good that software devs are truly fucked.

Archit3ch • about 4 hours ago
What's the alternative to frontier models? Disk-streamed GLM 5.1? By the time you get a single response back, the API will be back up.
andyfilms1 • about 5 hours ago
Sure, but unless you're training them yourself, they can still be compromised through poisoning or bias. They're still black boxes even if you're running them locally.
rvz • about 5 hours ago
I would have expected Claude to take time off first. It turns out that both ChatGPT and Codex decided to take some time off on vacation today.