Discussion (8 Comments)
Read Original on HackerNews

NikolaNovak•about 3 hours ago
I work in IT, have all my life, but these stories still have a sense of bizarre unreality to me, like a dream sequence that isn't of the real world.

I understand that some companies and people find it extremely empowering, accelerating, and convenient to plug AI into prod, but I come from the diametrically opposite culture of the old-school DBA / sysadmin mentality, rather than the "move fast and break things" mentality of modern dev.

Once it was explained to me, authoritatively, that hallucinations are mathematically impossible to eliminate, there's just no way I'm not "air / human gapping" any kind of LLM from any kind of prod.

I get that these headlines are sensationalist and these cases may or may not be representative, but it's stunning to me how many people go through mandatory AI 101 training, are basically made to acknowledge that an LLM will make things up confidently, and promptly forget it. I have executives sending me market research that's fully made up, techies saying software is dead because AI can make a payroll system in 5 minutes, and everybody wanting to plug an LLM into everything. And I'm not saying LLMs are useless, like some people do - I use them multiple times a day for various things. I just cannot imagine giving one root / sysadmin access to a prod system and database :-/

(Even the "unhinged apologies" - unless I'm mistaken, that too is basically fancy autocomplete, correct? It's not that the AI "acknowledges" or "understands" or "fesses up" when things go wrong, as even the technical media presents it. It's just what the training material / RLHF built as a statistical response to a mistake.)
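
A minimal sketch of what that "human gapping" could look like in practice, assuming a hypothetical review queue (none of these names come from the thread): the model's only capability is appending proposals, and nothing executes until a person approves it.

    # Hypothetical illustration: the LLM may propose actions, but only a
    # human can release them to prod. All names here are made up.
    from dataclasses import dataclass
    from typing import Callable, List

    @dataclass
    class ProposedAction:
        description: str   # human-readable summary of the intent
        command: str       # what the model wants to run, e.g. a SQL statement
        approved: bool = False

    class HumanGate:
        """Queue LLM-proposed actions; only a human review releases them."""

        def __init__(self) -> None:
            self.queue: List[ProposedAction] = []

        def propose(self, action: ProposedAction) -> None:
            # The model's only capability: appending to the review queue.
            self.queue.append(action)

        def review(self) -> None:
            # A person walks the queue and approves or rejects each item.
            for action in self.queue:
                answer = input(f"Run {action.command!r}? [y/N] ")
                action.approved = answer.strip().lower() == "y"

        def execute_approved(self, run: Callable[[str], None]) -> None:
            # Only explicitly approved commands reach the real executor.
            for action in self.queue:
                if action.approved:
                    run(action.command)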

rafaelmn•about 1 hour ago
> Once it was explained to me, authoritatively, that hallucinations are mathematically impossible to eliminate

That's a weak criterion - hallucinations are mathematically impossible to eliminate in humans, too.

_aavaa_•20 minutes ago
Humans can be held responsible; what are you gonna do to the AI? Wipe the context?

simplyluke•about 2 hours ago
The current sentiment within basically all of Silicon Valley is to remove every possible guardrail and accelerate AI adoption as fast as possible, consequences be damned.

The uptime of major websites recently should be a tell of how well that's going.

standardly•about 2 hours ago
I've noticed a general decline in performance across several major applications within the past year or so. Not making any accusations yet, because it could be placebo, or coincidence, or selection bias... but I have my suspicions.

JimsonYang•43 minutes ago
Can someone more technical explain the cause of this?

No separate production and development keys and builds? Seems like a careless mistake rather than the sensational story the media is making it out to be.
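
For what that separation can look like, a minimal sketch: one credential per environment, selected by an explicit switch, so a dev build never holds the production key. The variable names (APP_ENV, SERVICE_API_KEY_*) are illustrative assumptions, not details from the incident.

    # Illustrative only; variable names are assumptions, not from the story.
    import os

    def get_api_key() -> str:
        env = os.environ.get("APP_ENV", "development")
        var = {
            "production": "SERVICE_API_KEY_PROD",
            "development": "SERVICE_API_KEY_DEV",
        }.get(env)
        if var is None:
            raise RuntimeError(f"unknown APP_ENV: {env!r}")
        key = os.environ[var]  # raises KeyError if the key isn't configured
        if env != "production" and key.startswith("prod_"):
            # Fail loudly if a production key leaks into a dev build.
            raise RuntimeError("production key found in a non-production env")
        return key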

drwl•40 minutes ago
It's spelled out in the linked tweet: https://x.com/lifeof_jer/status/2048103471019434248

cheald•about 1 hour ago
If you wouldn't give it to an enthusiastic junior dev, don't give it to AI, period.
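
Taken literally, that rule is just least privilege. A toy sketch under hypothetical names: the agent's toolbox contains only read-only operations, so destructive commands are never even reachable from it.

    # Toy illustration of least privilege; tool names are made up.
    READ_ONLY_TOOLS = {
        "select": lambda q: print(f"read-only query: {q}"),
        "explain": lambda q: print(f"query plan for: {q}"),
    }

    def dispatch(tool: str, arg: str) -> None:
        handler = READ_ONLY_TOOLS.get(tool)
        if handler is None:
            # DROP/DELETE/UPDATE are not in the toolbox, so they can't run.
            raise PermissionError(f"tool {tool!r} is not available to the agent")
        handler(arg)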