Discussion (166 Comments) · Read Original on HackerNews
My parents didn't have a lot of money, but my great-grandfather passed and they used some of the inheritance to buy the computer. I was instantly hooked. In hindsight I see how much of a gift my family gave me.
The announcement reminded me of an article John Dvorak wrote around the same time. 1GB hard drives had just come out, and he asked what all the extra space would be used for. Even as a young teenager, I remember thinking how short-sighted that comment was. That was before I realized how the tech press tends to get stuck in local optimizations and can't understand the bigger picture.
It's all a good reminder that cutting edge today doesn't stay cutting edge very long, and the world figures out how to squeeze every ounce of power out of hardware. (Also, yes, that leads to bloat...)
[1] https://en.wikipedia.org/wiki/I486SX
[2] https://en.wikipedia.org/wiki/John_C._Dvorak
True for many, many of us, I suspect. My family bought a 286 in the early 90s and it cost something like $2000 CAD then, which is nearly $4000 now. But salaries were lower then, so this would have been something like 5-6% of my single-income family's post-tax earnings for the year, and if you think about it as a percentage of "disposable" income it was probably more like 60% of it.
Obviously it paid off in that it set me on the path for my career (hard to make any other investment as good as that), but who would have known that at the time? I'm glad that there were so many ads positioning computers as being educational and not just game machines; even though in reality I think it was learning about the computer to make the games work that taught me way more than any educational software ever did.
I’ve been thinking a lot about these inflation-adjusted prices due to the big Apple Computer anniversary — an Apple // cost $5000 in 2026 dollars, meanwhile a $600 Macbook Neo cost $150 in 1980 cash!
What helped me reconcile this was an observation that we’ve inverted the prices of necessities and luxury goods. Rent and mortgage in particular were a much smaller slice of income back then, but luxury goods were very expensive, so one would save up for a year or two to buy a new TV or a computer for the kids.
Now the necessities take a much larger slice of our income, but TVs and computers are incredibly cheap. It takes very little money to get a nice computer, and not-buying it barely makes a dent in the bills. This isn’t a good thing.
I do disagree a little with your observation regarding the industry “squeezing every ounce of power out of hardware”. Beyond local LLM stuff, there’s basically nothing a modern computer can comfortably do that any laptop since the mainstreaming of SSDs can’t.
Office tools and web browsing are less demanding.
But you can get way better results with the lowest-end computers than you could years ago. Back in the 90s my grandfather used 3DS Max to map out his future apartment's rooms and start planning furniture, using renders to get an idea of what the sunshine would look like at different times, etc. At the time, he did this on an expensive 486 that would take an entire day to render some of those visuals. Nowadays I can do the same with a free copy of Blender and any reasonably modern integrated GPU in probably under an hour.
Gotta tack on to this thread showing appreciation for parents. We could never afford new computers in the 90s, but luckily my dad could bring home obsolete equipment from work. We were thus always at least a generation behind. I remember my friend's Pentium feeling like sci-fi compared to our 386, but my goodness it completely molded my life!
Later, towards the end of the 90s, those sci-fi Pentiums were obsolete, so I got a few to run "that weird Linux stuff" on. Since it was considered junk, nobody cared what I did with it. To this day, if I happen to hear Metallica play and there's early winter's first smell of snow in the air, my mind will be transported back to that school night I secretly stayed up wayyy too late and discovered SSH for the first time. Haven't looked back.
Thank you, dad! I just hope general computing devices owned by regular people are still the norm by the time my children come of age.
I wanted to link his columns "Microsoft Dot Nyet" and "New Architecture Needed" from circa 2000-2001 but it turns out they have been memory-holed. They should be somewhere in the wayback machine.
EDIT: At least one of them has not been deleted, just his name has been removed
https://www.pcmag.com/archive/new-architecture-needed-32570
Pretty much the only thing I agree with is that computer architecture could use a complete rework (both from the software and the hardware side, though primarily the former), and that said rework is basically impossible in practice.
Mine neither, although the grandparents were moderately wealthy. But my mom understood very early on that it was a match for me and that computers would really take off.
Fun story: the first BASIC I ever got was an Atari 2600 cartridge that came with some kind of "keyboard" in two parts you'd plug into the joystick ports. When my parents bought that Atari 2600 they tried it and spent the entire night playing "Tank Attack" on the TV in their bedroom. She only told me that years later.
Then as I was writing tiny BASIC programs on the Atari 2600 gaming console, she realized I needed a "real" computer, so she bought me an Atari 600 XL a bit later. Then I began salivating over the neighbours' Commodore 64, which I could see through a window. And she thought: "If I buy the exact same computer as the neighbours, maybe my son and the neighbours shall become friends!". 42 years later, one of those neighbours just went to visit my brother in another country, and his brother and I exchange Telegram messages nearly daily.
Then the Amiga. Then the 386, 486, etc.
What a mom. RIP.
Mine was the 486 DX 2/66.
The trouble with the 486 SX 25 was IMO that a fast 386 easily beat it. I was part of the demo scene back then and wanted to compete with the likes of Dust, Future Crew.
And: Doom! If I remember correctly, it could be displayed and run at 800x600 on a DX2/66.
The 486 DX2 66MHz was the target platform for gaming for almost two years (1992-1994). That was a huge achievement back in the day, staying at the top that long.
Before it, you could claim that a 68040 was kinda-sorta keeping up with the 486 and that the nicer design and better operating systems of other computers made up for the delta in raw performance, but the DX/2 66 running Doom was the final piece of proof that the worse-is-better approach of using raw CPU grunt to blast pixels at screen memory instead of relying on clever custom circuitry was winning.
Faced with overwhelming evidence, everyone sold their Amiga 1200s and jumped ship to that hated Wintel platform.
And sometimes the DX/4 100MHz would be the slowest of them all, on a 25MHz bus.
My peripherals seemed to take it. My graphics output showed some slight glitches, which I was OK with for the speed.
However, I think it was a bit unstable and would fail a correctness challenge like compiling XFree86 or the Linux kernel, which were like overnight long runs. Must have been some bit flips in there occasionally. I seem to recall that once that reality settled into my brain, I went back to the clock tripler config.
The AGA chipset of the 1200/4000 stupidly only added 2 more bitplanes. The CD32 chip actually had byte-per-pixel (chunky) graphics support, but the omission from the 1200 was fatal.
Reading this in hindsight, there were probably too many structural issues for Commodore to remain competitive anyhow, but an alt-history where they would have seen the need for 3D rendering is tantalizing.
1: https://news.ycombinator.com/item?id=47717334
The intention was good, but the Akiko chip was functionally almost useless. It was soon surpassed by CPU chunky to planar algorithms. I don't think it was ever even used in any serious way by any released games (though it might have been used to help with FMV).
I agree. Unfortunately, even with chunky graphics and/or 3D foresight, 68k would still have been a dead end and Commodore would still have been mismanaged into death. It’s fun to dream though…
(Also, a Pentium 60 is barely faster than a DX/2 66 at many tasks — it is a Bad Processor — but that’s another conversation ;)
I think it was 1994. It was a loaded 486 with the best 17" CRT monitor money could buy at the time. I think he spent over $7000.
https://www.silverstonetek.com/en/product/info/computer-chas...
Suddenly, it was possible to imagine running advanced software on a PC, and not have to spend 25,000 USD on a workstation.
[1]: https://docs.redhat.com/en/documentation/red_hat_enterprise_...
https://ftp.gwdg.de/pub/gnu/www/directory/all/rhide.html
:-)
And you could use VESA linear framebuffer above 256KB - this was a breakthrough back then :-))
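(For anyone who never wrote VGA code, here's a minimal sketch of the addressing difference the linear framebuffer made. It's not real hardware code: the resolution, pitch and 8 bpp mode are assumptions, and on real DOS hardware the mode set and bank switch would go through VBE functions 4F02h and 4F05h rather than printf.)

```c
/* Rough sketch of banked vs. linear framebuffer pixel addressing.
 * Resolution/pitch are assumed; a real program would query them via
 * VBE 4F01h and set the mode with 4F02h (bit 14 requests the LFB). */
#include <stdint.h>
#include <stdio.h>

#define PITCH     640UL     /* bytes per scanline, assuming 640x480 @ 8 bpp */
#define BANK_SIZE 65536UL   /* the legacy window at A000:0000 is only 64 KB */

/* Banked access: work out which 64 KB window the pixel lives in,
 * switch to it (an INT 10h AX=4F05h call on real hardware), then write. */
static void put_pixel_banked(unsigned x, unsigned y, uint8_t color)
{
    unsigned long offset = y * PITCH + x;
    unsigned long bank   = offset / BANK_SIZE;
    unsigned long rest   = offset % BANK_SIZE;
    printf("switch to bank %lu, write colour %u at window offset 0x%04lX\n",
           bank, (unsigned)color, rest);
}

/* Linear framebuffer: one flat pointer, plain pointer arithmetic,
 * no bank switching inside the inner loop. */
static void put_pixel_linear(uint8_t *lfb, unsigned x, unsigned y, uint8_t color)
{
    lfb[y * PITCH + x] = color;
}

int main(void)
{
    static uint8_t fake_lfb[PITCH * 480];   /* stand-in for the mapped LFB */
    put_pixel_banked(100, 200, 15);         /* this pixel falls in bank 1  */
    put_pixel_linear(fake_lfb, 100, 200, 15);
    return 0;
}
```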
https://flint.cs.yale.edu/feng/cos/resources/BIOS/procModes....
https://en.wikipedia.org/wiki/Protected_mode
While emulating an FPU results in a huge performance penalty, it is only required in certain domains. In the world of IBM PCs, it was also possible to upgrade your system with an FPU after the fact. I don't recall seeing this option outside of IBM compatibles. While I have seen socketed MMUs on other systems, I don't know whether they were intended as upgrade options.
To a significant degree, the 486DX2 was the primary computing platform that created the foundation I needed to learn computing in depth, enabled my later career, and really set many of the formative moments in my life. Thanks, Intel; even though you suck now as a shadow of your former self, you were a beast in the 90s.
Back then, 10 years of technological advancement made a huge difference. Today, you can get by just fine with a 2016-era laptop.
The CPU I'm working with is a Celeron M 900MHz, single core, no HT, struggling to build wheels for Python (several hours).
Yeah, Node is usually my go-to, love JS.
Flat framebuffers and "powerful" CPUs also enabled easier software rendering of 3D (Doom/Duke), compared to the Amiga, where writing textured rendering is a PITA due to the video memory layout: separate bitplanes spread the bits of each pixel across different memory locations (the total memory bandwidth reduction from using 5 or 6 bitplanes, which made sense in 1985, became a fatal bottleneck at this point).
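(To make the bitplane pain concrete, a small illustrative sketch - not real Amiga or VGA code, and the 5-plane, 320x200 figures are just assumptions. A chunky framebuffer needs one byte write per textured pixel; the planar layout needs a read-modify-write in every bitplane.)

```c
/* Chunky vs. planar pixel writes, purely illustrative in-memory buffers. */
#include <stdint.h>

#define WIDTH     320
#define HEIGHT    200
#define PLANES    5                  /* 32 colours, OCS/ECS-style, assumed */
#define ROW_BYTES (WIDTH / 8)        /* one bit per pixel per plane */

/* Chunky (VGA mode 13h style): one byte per pixel, one write per pixel. */
static void put_pixel_chunky(uint8_t *fb, int x, int y, uint8_t color)
{
    fb[y * WIDTH + x] = color;
}

/* Planar (Amiga style): each colour bit of the pixel lives in a different
 * bitplane, so one textured pixel touches PLANES separate memory locations. */
static void put_pixel_planar(uint8_t planes[PLANES][HEIGHT * ROW_BYTES],
                             int x, int y, uint8_t color)
{
    int     byte = y * ROW_BYTES + x / 8;
    uint8_t mask = 0x80 >> (x % 8);  /* leftmost pixel is the high bit */
    for (int p = 0; p < PLANES; p++) {
        if (color & (1 << p))
            planes[p][byte] |= mask;
        else
            planes[p][byte] &= ~mask;
    }
}

int main(void)
{
    static uint8_t chunky[WIDTH * HEIGHT];
    static uint8_t planar[PLANES][HEIGHT * ROW_BYTES];
    put_pixel_chunky(chunky, 10, 20, 17);   /* one write  */
    put_pixel_planar(planar, 10, 20, 17);   /* five read-modify-writes */
    return 0;
}
```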
It wasn't really always full framerate though, and the 2D chipsets did help in "classic" action games that were still all the rage.
The Pentium further widened the gap; at the same time consoles gained hardware 3D acceleration (PSX/Saturn/Jaguar), yet the Pentium could do graphics better in some respects (as shown with Quake).
Once 3D accelerators landed, PCs have more or less constantly been ahead, apart from when it comes to price (and comfort/ease).
It was pre-internet, obviously, so obtaining software was very difficult. For years while I was learning assembler I was using a so-called "monitor cartridge" that did simple assembly/disassembly, but it didn't support labels and such. I could read about software like "Meta Assembler" that let you use labels and variables and think, "wow, I could do so much stuff with that..."
My first PC was sometime in the late 90s: a Celeron 233MHz with Windows 95. I wasn't a huge fan of Windows back then. I remember when one of the PC magazines I got had RedHat Linux install CDs. I liked it from the start. The fact that my software-only modem and Lexmark printer didn't work got me into kernel programming :-)
Fun to think of it now, but I prefer 2026 a 100x :-)
In other words, faster hardware was needed because the quality and performance of the software dropped. I was doing spell-checking with WordStar on a CP/M Apple II with zero lag -- and WordStar fit on one side of a 5.25" floppy.
Word 97 also had as-you-type grammar checking, which WordStar never had. WordStar did have an add-in, extra-cost grammar checker whose name escapes me at the moment. But again, it was never real time.
Yes, programs have become bloated, but it is worth it to compare apples to apples.
One might argue that real time isn’t necessary, and one might be right. But that’s different from poorly written.
I'd argue it was a combination of "now we have more processing power, let's see how we can use it up" and "we don't have to make so many hard design and programming decisions thanks to the extra power", with the result being that you "had" to get the new chips in order to run the new software that was replacing the old software.
Repeat that a number of cycles and we wound up with Windows Vista ;)
Since we're discussing word processors, I would say that WordPerfect 5 for DOS was the best word processor I've used to date (Pages on Mac comes in second). It did almost everything that Word does today in terms of word processing (not page layout, but Word is terrible at that anyway; you really need InDesign to do that properly), was fast and easy to work with (keyboard shortcuts for operations are much faster than a mouse/GUI), and didn't require nearly as much processing power.
That's really long compared to the 1-year refresh cycles we have today with phones etc.
It was a life-changing machine.
Ordered, I believe, from the depths of a Computer Shopper magazine.
- tinkered for HOURS to get enough EMM/XMM memory by tweaking CONFIG.SYS & Co. to get whatever game to run (and had dedicated boot options configured, because you could unload some drivers from memory and then run other games) - see the CONFIG.SYS sketch below
:-D
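(For readers who never had to do this, a hedged sketch of the kind of multi-boot CONFIG.SYS being described, using the MS-DOS 6.x menu syntax. The paths, driver names and switches are purely illustrative, not a known-good config.)

```
[MENU]
MENUITEM=NORMAL, Normal start (mouse, CD-ROM, sound)
MENUITEM=GAMES,  Maximum conventional memory for games
MENUDEFAULT=NORMAL, 10

[COMMON]
DEVICE=C:\DOS\HIMEM.SYS
DOS=HIGH,UMB
FILES=30

[NORMAL]
DEVICE=C:\DOS\EMM386.EXE RAM
DEVICEHIGH=C:\CDROM\CDROM.SYS /D:MSCD001

[GAMES]
REM EMM386 and the CD-ROM driver are deliberately left out here
```

Pick "GAMES" at boot and EMM386 and the CD-ROM driver simply never load, which is what kept the pickier games happy.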
• Ran my first real UNIX at home on a PA-RISC (HP 9000-715/75 with HP-UX 9.03 and 96 MB RAM) in 1997, 20" color CRT.
• Today, Linux is still here, but on a 2-CPU, 140-core AMD server with 2 TB RAM, hundreds of TB NAS and a 40" TFT... (and it still takes too long to open the bloated Web browser! - keenly awaiting Ladybird to the rescue in August.)
Chromium browsers launch pretty fast. If you're talking about memory usage, Ladybird isn't aimed at minimal memory usage from what I've seen.
I built a 486 Compaq Novell server for the company I worked for and named it Godzilla - gives a sense of how the 486 was seen.
The 386 SX was crap: 16-bit-wide bus, IIRC.
If you were running 16-bit software, they were little slower than a 386DX at the same clock, and significantly faster than a 286. That was partly because of higher clocks (286s usually topped out at 12MHz, though there were some 16MHz options, while the slowest 386s ran at 16MHz, with some as fast as 40MHz), and partly, when not blocked by instruction ordering issues, because of the (albeit small by modern standards) instruction pipeline, which the 286 lacked.
32-bit software was a lot slower than on a DX because 32-bit data reads and writes took two trips over the 16-bit data bus, but you could at least run the code as it was a full 386 core otherwise (full enhanced protected mode, page based virtual memory, v8086 mode, etc).
The SX also only used 24 bits of the address bus, limiting it to 16MB of RAM compared to the original's 4GB range, though this was not a big issue for most at the time.
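(Just to put numbers on the two SX limits above, a trivial arithmetic snippet; the figures are architectural constants quoted from the comment, nothing more.)

```c
/* Back-of-the-envelope numbers for the 386SX's bus and address limits. */
#include <stdio.h>

int main(void)
{
    unsigned long long sx_addr = 1ULL << 24;   /* 24 address lines -> 16 MB */
    unsigned long long dx_addr = 1ULL << 32;   /* 32 address lines -> 4 GB  */
    printf("386SX address space: %llu MB\n", sx_addr >> 20);   /* 16   */
    printf("386DX address space: %llu MB\n", dx_addr >> 20);   /* 4096 */

    /* A 32-bit load or store needs two trips over the SX's 16-bit data
     * bus, so memory-bound 32-bit code pays roughly a 2x bus penalty. */
    printf("bus cycles per 32-bit access: SX=%d, DX=%d\n", 32 / 16, 32 / 32);
    return 0;
}
```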
I know with my 286 you could pair a 287 next to it... not sure it really made a difference you could discern outside of hyper-specific uses, though.
Very little, if any, "home" or small-business software would make use of a floating-point unit though (maybe some spreadsheet apps did?). The most common use for them was CAD/CAM, plus scientific modelling by people whose budget didn't allow for anything less consumer-grade.
The lack of imagination is just disturbing.
In the 2000s through now we've mostly had improvements - 4K YouTube is much better than RealPlayer, but it's still just "online video". AI is definitely a "new" thing, and it's somewhat awoken a similar spirit to the 80s/90s - but not the same breadth. Dad bringing home a computer because he wants to do spreadsheets, and you finding it can run DooM or even play music.
Anyway, a while ago I was reading an article by a guy who lived through the same era as I did, laughing at modern developers whom he had asked to size a machine to add all the integers from 1 to 100. Setting aside that a 7-year-old Gauss found the closed form of that sum (the triangular number formula) in about ten minutes and got the correct result of 5050 without any of the arithmetic busywork, it's totally insane what some of the answers involved… with some involving the terms "Big Data" (yes, it was that era of hype, before "Crypto" and "AI") and some even (allegedly) mentioning 'clusters'. I really wish I could find you a link.
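(For completeness, the closed form being alluded to; the whole "cluster-sized" job fits in one line of C and a 16-bit integer.)

```c
/* Gauss's closed form: 1 + 2 + ... + n = n*(n+1)/2 */
#include <stdio.h>

int main(void)
{
    int n = 100;
    int sum = n * (n + 1) / 2;   /* 100 * 101 / 2 = 5050 */
    printf("%d\n", sum);         /* 5050 fits comfortably in a 16-bit int */
    return 0;
}
```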
The Pentium is the first one, I think, where this didn't happen, because by then it turned out that people need a computer that can do what they are currently doing, but faster, much more often than they need servers.
All we really have to look forward to in the future of increasing-performance personal computing is doing the same things as yesterday, but doing them faster.
The future after today will probably turn out more interesting than that, of course, but we can't know that until it happens.
And the future after 1988 certainly turned out to be a very interesting time in computing -- but they had no idea what was in store. Perhaps you can use your time machine to go back and let them know?
Everyone - everyone knew it was the start of a revolution.
Nowadays, 486 computers are getting rare and relatively expensive. CPUs themselves are 25, 30, 40, sometimes 50 bucks on eBay. Whole working systems are in the low hundreds, and fully working 486 laptops can fetch 400 or 500 bucks.
sigh
Through the magic of actually saying something different, which really ended up being proven incorrect. From the blog post above, verbatim, italicizing the relevant bits:
> Writing in the May 8, 1989 issue of Infoworld, Michael Slater warned that the sixfold speed increase seen from 1981 to 1989, going from 5 MHz to 33 MHz, would not be repeated.
[0] https://en.wikipedia.org/wiki/Dennard_scaling
Played some awesome games, like DOOM and Wolfenstein. Later Duke3D was the shit. But I can't remember if I ran them on the same setup or something newer.