Discussion (19 Comments)
Read Original on HackerNews
Get perpendicular: https://www.dailymotion.com/video/x62mja
https://wccftech.com/intel-showcases-its-zam-memory-prototyp...
The connectors on the side indeed look like the letter Z. Maybe it disperses the stronger currents across the stack of crystals instead of concentrating them.
The closest thing I can think of that maybe came close to challenging DRAM is HP's memristors, but those really didn't pan out (probably too much power consumption).
Pet peeve: it's a stupid analogy, seeing how wheels kept being improved throughout the millennia with every new technology. The only thing they have in common is that they're round.
Similarly, DRAM has been improving since the 70s to the point of being barely recognizable in any form you see it today.
That said, DIMMs and the whole bus idea are in dire need of a new type of bearing.
Then again, flight itself has obviated—or, rather, introduced—many transit workloads that could be performed by wheeled vehicles, and operates on different principles entirely.
I thought this was going to mean each stack was able to directly talk to the controller, since all stacks are resting on an interposer thing. But actually there is still a logic controller slice at the bottom of the stack, not at a right angle to the stack.
Instead of HBM microbumps between layers there is a more compact/dense TSV ("fusion bonded via-in-one") system. Intel once more showing their strong chiplet packaging prowess! The claim is that thermals are still much better somehow, in spite of volumetric cell density increasing (from thinner layers). The demo has 8+1 DRAM+controller layers.
Every time the recipient hypes the shit out of it, of course.
As far as I can tell, Intel more-or-less pioneered the idea of SSDs being the best storage rather than the cheap storage, for instance. The X25-M and X25-E were absurdly good. Then, once the market was established...they pulled out of it.
Popular-science kind of backgrounder (can't vouch for the accuracy/relevance; details are very scarce): https://www.geeksforgeeks.org/digital-logic/polymer-memory/
I think what's semi-unfortunate is all the swings and misses, especially the cases where it wasn't necessarily a bad idea but Intel gave up too soon:
- Massively parallel simple-ish x86 cores à la Xeon Phi; okay, maybe not the best idea on the surface, but I feel like nowadays the opportunities could be more forthcoming for reusing parts of that tech (and maybe they do but are just quiet about it, i.e. GPU acceleration)
- Optane. I think the tech would have been cheaper if they had made the licensing terms easier, but maybe I'm missing part of the equation...
- This thing where they keep half-assing the GPU strategy. Imagine if the B70 had launched last year alongside the B60 and B50, before DRAM prices went sideways. Or if they hadn't taken so long to release a >16GB GPU in the first place; that would have built a lot of interest. Instead they finally release a 32GB GPU alongside more bad news for the overall roadmap, and the whole situation becomes a jarring rollercoaster that makes everyone worry Intel is going to kill the project the way everything but CPUs gets killed lately.