Samsung is readying some potentially groundbreaking tech: stacking memory directly on top of a CPU or GPU. Switching to this technique could improve performance, power efficiency, and capacity all at once. Unfortunately, many of us will never directly experience the benefits, as Samsung is going to use it with its high-bandwidth memory (HBM), meaning we won’t find it even in the best graphics cards available.

The tech in question is a new 3D packaging method that belongs to Samsung’s Advanced Interconnect Technology (SAINT) platform, with this latest iteration dubbed SAINT-D. Each variant covers a different 3D stacking combination: SAINT-S stacks an SRAM die on top of a logic die, SAINT-L stacks logic on logic, and SAINT-D stacks HBM on top of logic chips, meaning either CPUs or GPUs.

SAINT-D involves stacking HBM vertically on top of the processor and connecting the two chips through a substrate placed between them. This is a huge change from Samsung’s current 2.5D packaging approach, which connects the HBM chips to the GPU horizontally via a silicon interposer.

The introduction of 3D packaging could be the first step toward launching Samsung’s next-gen HBM4. Samsung itself refers to SAINT-D as a “DRAM breakthrough for HPC and AI.” The company also described the benefits of using this technique, as cited by The Korea Economic Daily: “3D packaging reduces power consumption and processing delays, improving the quality of electrical signals of semiconductor chips.”

As announced during the Samsung Foundry Forum 2024, the company will offer its new 3D HBM packaging as part of a turnkey service. This means an end-to-end solution where Samsung will both produce the HBM chips and integrate them onto GPUs for fabless companies. As SAINT-D is said to be making its debut this year, and the next-gen HBM4 model is set to arrive in 2025, this new method could make quite the splash in HPC use cases very soon, including various AI uses.

Samsung’s breakthrough doesn’t mean much for consumers — not yet. HBM is, as the name itself implies, used in high-performance environments, and to top it all off, this 3D packaging technology is reportedly even more expensive to produce than its predecessors. However, 3D VRAM is an interesting concept. If it works out well in data centers, it may one day make its way to our PCs.
