Discussion (16 Comments) · Read Original on HackerNews

ClawsOnPaws · about 1 hour ago
I'm in a similar position to the OP, unemployed for about 10 months, with tons and tons of applications sent both remote and local, and yeah not sure where this is gonna go or what I'm supposed to do. Also disabled, my eyes don't work so that automatically removes many, many non-software jobs I'd otherwise do from the equation.

Don't even really have anything else to say other than that, but maybe commenting it somewhere helps someone else realize they're not alone. I don't know how that helps you or me, but that's what I got. Maybe there's still something for us somewhere, but it is very difficult to stay motivated, and I don't have an answer.

arkt8 · about 3 hours ago
More than ever, it's time to be stoic: have things, but live as if you have nothing. And, obvious as it is, as the author says, this was predictable too.

Right now, in my country, I see high prices for laptops with only 4 GB of RAM and Celeron CPUs.

That hardware could do wonderful things if, back in the 2000s, people hadn't bought the argument that "hardware is so cheap, let's just write inefficient code." The same hardware that could play a YouTube video in the 2000s can't even open the website today. Electron sends its hugs...

Now people are mad about AI, but until when? Until the oceans dry up, like in the movie Oblivion?

And professionals? The generation of specialists will pass... and people will soon depend blindly on AI if the course of things isn't stopped, or at least corrected.

I think the author could have brighter days in the future (and even in the present, in some hidden niches), as knowledge will always be precious.

The main lesson I take is: buy less IT and fewer buzzword promises, and find the place where knowledge and craft walk side by side.

donatj · about 3 hours ago
I've come to the conclusion in the last couple years that being the guy who understands how the abstraction works under the hood is treated by companies as more of a liability than a virtue.

More and more places just want Jira tickets done fast instead of someone who's going to push back or question whether this is the best way to build something. They want the thing; they don't care if it works well. They don't care if it's efficient. They want it now.

We've been moving to React, replacing an internal framework that's worked wonders for us and that we've been using for over a decade. The biggest part of the move is "hiring".

My general sense is that nobody understands how React works under the hood. The answer I get when I ask questions is generally just "don't worry about it".

Everything is giant, overbuilt, and terrible because most people never bothered to learn even a single level up from where they do most of their work. The people who do become unhirable. Everything takes hundreds or thousands of times more cycles and more electricity than it should because people can't be bothered to understand what they're doing.

rdevilla · about 3 hours ago
> I've come to the conclusion in the last couple years that being the guy who understands how the abstraction works under the hood is treated by companies as more of a liability than a virtue.

This is one of the most alienating things about the modern software engineering industry. Someone who grew up just fucking around with computers since they were 5 is supposedly now on even footing with someone who took a 16 week bootcamp and a Claude subscription and has never seen a terminal before.

I was at a drum and bass show recently and talked to one of the other people there. It was obvious I didn't really listen to that much drum and bass as I couldn't name anybody except the most popular artists. You see peoples' reactions change slightly when they discover you are not really part of their music scene - you're an outsider, or a tourist, or even a poser. That's not even a problem, that's just the way subcultures are - you've either lived and breathed that way of life, or not.

What LLMs are doing is they are automating the manufacture of posers and cultural appropriators at scale - you don't really understand the nooks and crannies of this territory, you never actually lived on IRC or in the bash terminal - but you can sure wave around these oversimplified maps of the territory with all the back alleys and laneways missing, and use your pocket book of translated phrases to pose as a native.

> My general sense is that nobody understands how React works under the hood. The answer I get when I ask questions is generally just "don't worry about it".

The problem in software is it seems that we are losing the ability to distinguish between appropriators of computer geek culture and those who do "speak" programming languages natively. The bar has fallen so low that I can't even expect people to understand the difference between runtime and compile time. Anybody who brings up such advanced and esoteric (read: high school level computing) topics is viewed with scorn, as if their ability to expose ignorance on foundational topics presents an existential (or career) threat.

There's been a rise of anti-intellectualism in software from people with non-STEM backgrounds who actually disdain seeking out and possessing such knowledge. It's utterly useless to study - just like math. I find it harder and harder to locate hobbyists, especially here in Toronto, who bother to go below the abstractions not just because they want to, but because they are compelled to understand.

globalnode · about 1 hour ago
sounds like youre working at the wrong place. detailed computing knowledge and maths is essential in some industries and like you said, scorned in others. i couldnt think of anything worse to do with my time than spend all day with mba's or webdevs (lol im sorry thats unfair, web development is complex with all the callbacks and sync issues).

jongjong · about 2 hours ago
This has been obvious to me since I graduated with a BIT majoring in 'Software Design.' I literally went to university with software design and software architecture as my core interests.

When I graduated, I was shocked to learn that no company cared about any of the architectural concepts I had learned. UML class diagrams, sequence diagrams, ER diagrams, etc. had been on the way out. At one point, as internet companies were scaling up, there was a brief resurgence of interest in sequence diagrams, especially as a communication method for explaining complex bugs or complex message-passing scenarios. But it didn't really last. Nowadays most software is riddled with race conditions and deep, exploitable architectural flaws. Cryptocurrencies have been victims of many such attacks; billions of dollars have been lost to race conditions, and that's just the ones that were discovered. They are notoriously difficult to find post-implementation.

The programming primitives that we're using today aren't optimized to avoid race conditions or even try to encourage good concurrency patterns; quite the opposite; they encourage convenient but disorganized parallelization and they're optimized to put the focus on type safety which is a far less concerning issue. A lot of people who were rightly alarmed by gaps in schema validation (which is critical at API boundaries) became overly obsessed with type safety (which is a broader concern). I have built some async primitives for Node.js, nobody cared! NOBODY! Other developers have had the same experience with most other languages. I think only a few niche languages like Elixir actually treated it as important. But nobody even acknowledged that the problem could be remedied in existing languages. It's so bad that it seems as though some people wanted it to be that way.

The term 'concurrency safety' doesn't even exist! Some have a vague idea about thread safety, but that's specific to one particular concurrency primitive... what about the concurrency of asynchronous logic, which is much more common nowadays? I have felt thoroughly suppressed in that regard throughout my career.
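A minimal sketch of the kind of async race being described, in Node.js (hypothetical bank-account example, not from the thread; the `await` is a suspension point where another task can interleave):

```javascript
// Check-then-act across an await: the "check" can go stale before the "act".
let balance = 100;

async function withdraw(amount) {
  if (balance >= amount) {                      // check
    await new Promise(r => setImmediate(r));    // suspension point: other tasks run here
    balance -= amount;                          // act, based on a now-stale check
  }
}

async function demo() {
  // Both withdrawals pass the check before either subtracts.
  await Promise.all([withdraw(80), withdraw(80)]);
  return balance;                               // -60, not the 20 a sequential run gives
}
```

No threads are involved; the interleaving of plain `async`/`await` logic is enough to corrupt shared state, which is the "concurrency safety" gap the comment is pointing at.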

The only voice on the subject of architecture that got through to the 'mainstream' was Martin Fowler (one of the authors of the Agile Manifesto). After that, there was Dan Abramov of Redux fame. Some notable opinionated architecture books were published, but none really identified the underlying essential philosophy of good architecture.

The best, most succinct quote I ever read on the subject was from Alan Kay (inventor of OOP) who said "I'm sorry that I long ago coined the term 'objects' for this topic because it gets many people to focus on the lesser idea. The big idea is messaging."

I like that quote for many reasons; firstly because it shows wisdom, secondly, it tells you what the issue is, very simply and, thirdly, it hints at the importance of 'focus' in this discipline where we are saturated with thousands of complex overlapping and partially conflicting ideas.

I think the FP trend was somewhat of a red herring. Same with Type Safety. Yes, they were useful to some extent, there are some really good ideas in there, but people got so caught up in them that the most fundamental area of improvement was ignored entirely. To me, the core value proposition of FP can be reduced down to "pass by value is safer than pass by reference." Consider that in the context of Alan Kay's "The big idea is messaging." - Is an object reference a message? NO! A live instance is not a message! Precisely! His point supports pass-by-value, furthermore, it encourages succinct/minimal parameters.

Good architecture is rooted in two core concepts: 1. loose coupling and 2. high cohesion. You achieve those by separating logic and structure from messaging. The biggest mistake people make is passing around structure and logic as parameters to other logic. You should avoid moving logic and structure around at runtime; only messages should move between objects, and the simpler the messages, the better. Note that 'avoid' doesn't mean never, but it means you have to be extremely careful when you do violate this principle, and there should be a really good commercial reason to do so. That is, you should exhaust other reasonable approaches first.
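A concrete sketch of the pass-by-value point (hypothetical order-discount example, just to illustrate):

```javascript
// Mutating a live reference: the callee silently changes the caller's state.
function applyDiscountInPlace(order) {
  order.total -= 10;   // caller's object is modified at a distance
}

// Treating the input as an immutable message: a new value comes back instead.
function applyDiscount(orderMsg) {
  return { ...orderMsg, total: orderMsg.total - 10 };
}

const order = { id: 1, total: 100 };
const discounted = applyDiscount(order);
// order.total is still 100; discounted.total is 90
```

The second function receives only a value "message" and can never surprise the code that holds the original object, which is the safety property being attributed to FP here.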

kelsier_hathsin · 43 minutes ago
> only messages should move between objects

Can you provide an example for this?

burakemir · 18 minutes ago
Say you have a Car, Engine and Dashboard object.

Let's not have the dashboard access the temperature by doing `GetSurroundingCar().engine.temperature`.

If the dashboard needs to get the temperature from a sensor in the engine, it should be able to "talk" to the sensor without going through the car object.

In ideal OOP, a method call `o.m(...)` is considered a message `m` being sent to `o`.

In practice, field access, values, and "data objects" etc. are useful. OOP purism isn't necessarily helpful if taken to the extreme.

The pure OOP idea emphasizes that the structure of a program (how things are composed) should be based on interactions between "units of behavior".
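The sensor example above might be sketched like this (hypothetical names, assuming a simple publish/subscribe channel between sensor and dashboard):

```javascript
// The sensor publishes readings as plain messages; subscribers never
// reach through Car or Engine structure to read state directly.
class TemperatureSensor {
  constructor() { this.listeners = []; }
  subscribe(fn) { this.listeners.push(fn); }
  publish(celsius) {
    // A message is a simple value, not a live object reference.
    this.listeners.forEach(fn => fn({ type: 'temperature', celsius }));
  }
}

class Dashboard {
  constructor(sensor) {
    this.lastReading = null;
    sensor.subscribe(msg => { this.lastReading = msg.celsius; });
  }
}

const sensor = new TemperatureSensor();
const dash = new Dashboard(sensor);
sensor.publish(92);
// dash.lastReading === 92, with no Car.engine.temperature traversal anywhere
```

The dashboard depends only on the message shape, so the surrounding Car/Engine structure can change without touching it.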

jongjong · 13 minutes ago
1. Avoid passing live instances (by reference) to other instances as much as possible, because you don't want instance references scattered too widely throughout your codebase. This can cause 'spooky action at a distance,' where the instance state is modified by interactions in one part of the code and unexpectedly breaks a different module that holds a reference to the same instance somewhere else. The more broadly a reference is scattered, the harder it is to figure out which part of the code is responsible for the unexpected state change. These bugs are often very difficult to track down because stack traces tend to be misleading: the culprit may be in a completely different part of the codebase.

2. Avoid overly complex function parameters and return values. Complexity there increases the coupling of your module with dependent logic and is often a sign of low cohesion. The relationship between cohesion and coupling tends to be inversely proportional: if you spend a lot of time thinking about the cohesion of your modules (i.e., giving each module a distinct, well-defined, non-overlapping purpose), loosely coupled function interfaces will tend to come naturally.

The metaphor I use the most to explain this is:

If you want to catch a taxi from point A to point B, do you bring a steering wheel and a jerry can of petrol with you to give to the taxi driver? This is an easy-to-understand example of a low-cohesion design: there are improper, overlapping responsibilities between you and the taxi service, which add friction. Sometimes it's not so simple, the problem is less familiar, and you really need to think it through. For important architectural decisions, I personally like to design systems on paper and sleep on the design for several days to make sure it's optimal.
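The taxi metaphor translates to interfaces roughly like this (hypothetical ride-service API, purely to illustrate the cohesion point):

```javascript
// Low cohesion: the caller supplies the service's own internals.
// function requestRideBad(steeringWheel, fuelCan, destination) { ... }

// High cohesion: the caller sends one small message; the service owns
// its steering wheel and fuel, and those never appear in the interface.
function requestRide(msg) {
  const { from, to } = msg;
  return { status: 'dispatched', from, to };
}

const trip = requestRide({ from: 'A', to: 'B' });
// trip: { status: 'dispatched', from: 'A', to: 'B' }
```

The parameter list is the coupling surface: when the caller only has to say where it wants to go, the taxi service can swap out every internal detail without breaking any caller.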

globalnode · 36 minutes ago
nice post, lately ive been dealing with concurrency, between threads and processes. trying to keep it cross platform as well, its a lot to learn. if you have large buffers and want to keep some semblance of performance, its VERY interesting understanding all the transfer mechanisms and cache levels involved. i feel these are the sorts of things my education skipped, it was all very focused on the static structure of objects not the dynamics of data transfer.

skydhash · about 3 hours ago
> More and more places just want Jira tickets done fast instead of someone that's going to push back or question if this is the best way to build some thing.

That's one thing I never care to do unless I'm the one making the technical decisions. What I do is build the thing, but with defensive programming in place. I take care to make sure my code is good, then harden every interface so that I can demonstrate that I'm not the cause of new bugs. People will be careless, so make sure you have blast doors between your work and theirs.

And I do take the time to learn about the abstractions of the new shiny tools, even when they're overengineered. Going in blind and making mistakes is not my cup of tea.
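Hardening an interface in that spirit might look something like this (hypothetical `createUser` boundary, just a sketch of the "blast door" idea):

```javascript
// Validate at the module boundary so careless callers fail loudly here,
// and bad data never reaches the internal logic.
function createUser(input) {
  if (typeof input !== 'object' || input === null) {
    throw new TypeError('createUser: input must be an object');
  }
  const { name, age } = input;
  if (typeof name !== 'string' || name.length === 0) {
    throw new TypeError('createUser: name must be a non-empty string');
  }
  if (!Number.isInteger(age) || age < 0) {
    throw new RangeError('createUser: age must be a non-negative integer');
  }
  // Freeze the result so downstream code cannot mutate it behind our back.
  return Object.freeze({ name, age });
}
```

When a bug does surface, the thrown error names this boundary and the offending argument, which is exactly the "demonstrate I'm not the cause" property the comment is after.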

oxag3n · about 2 hours ago
"Any problem in computer science can be solved with another layer of indirection, except of course for the problem of too many layers of indirection." Bjarne Stroustrup

That's why you see hundred-level call stacks, polymorphism with a single implementation, and errors that are still hidden, with root causes buried behind "exception caught".

hamasho · about 3 hours ago

> "Duplication is far cheaper than the wrong abstraction." (Sandi Metz)

shadowgovt · about 3 hours ago
Oof. There are two pieces to this story. One is great and one is heartbreaking.

The fact that modern tech has disintermediated people with problems to solve from the need for a "priest class" to commune with the machine to solve the problem is a great thing. It's the goal. The more we do it the better we are making the world for humans.

... the fact that people need to work to eat or provide anything above a subsistence quality of life is not only tragic, it's increasingly abhorrent in a world where automation and simplification via machines has freed up this much raw resource and free time.

If we're pitting LLMs against people's ability to provide for their families, we have lost the thread on why we're doing any of this.

arkt8 · about 3 hours ago
It's not the automation, it's the way... We've come far since the domestication of agriculture and energy... but profit as the main driver isn't just suboptimal, it's tragic. Having learned about so many accidents in complex systems, it's madness to see things at this point in the most complex system of all: society.

hgyyy · about 2 hours ago
To be fair, profit is what drives the survival of the firm.

However, there are tasteful ways of pursuing it, and Google and Meta in particular are certainly not among them.