Tuesday, November 4, 2025
Starting with the roadmap, SK hynix has laid out two timeframes: the first covers 2026 to 2028, and the second covers 2029 to 2031. Across both, SK hynix is planning a range of HBM, conventional DRAM & NAND products. For 2026-2028, the company is planning HBM4 16-Hi and HBM4E 8/12/16-Hi, along with its custom HBM solution.
The custom HBM solution essentially moves the HBM controller onto the HBM base die; other IP blocks, such as protocol logic, are moved there as well. This frees up area on the GPU/ASIC for compute silicon, and the custom HBM design also reduces interface power consumption. SK hynix will be collaborating with TSMC on its custom HBM solution and base dies.
Besides this, SK hynix will also roll out a slew of conventional DRAM solutions, such as LPDDR6, and AI-focused "AI-D" DRAM solutions, such as LPDDR5X SOCAMM2, MRDIMM Gen2, LPDDR5R, and 2nd Gen CXL LPDDR6-PIM. For NAND, the standard solutions will include PCIe Gen5 eSSD with up to 245 TB+ QLC capacities, PCIe Gen6 eSSD/cSSD, UFS 5.0, and AI-focused "AI-N" NAND solutions.
SK AI Summit 2025 - DRAM & NAND Roadmap 2029-2031
Moving into the 2029-2031 timeframe, SK hynix will start developing its next-gen HBM5, HBM5E solution, and the respective custom HBM5 solutions. On the DRAM front, SK hynix is going to offer GDDR7-next & DDR6 products alongside 3D DRAM.
Slides: AI-N SSD diagrams covering AI-N Performance, AI-N Bandwidth, and AI-N Density; AI-D memory modules labeled SOCAMM2, LPDDR5R, CXL-CMM, and 24GB LPDDR5X under Optimization, Breakthrough, and Expansion.
The "GDDR7-next" positioning is interesting: it means it will be some time before we see anything beyond GDDR7 in conventional discrete graphics cards. First-generation GDDR7 is currently capped at 30-32 Gbps, while the standard maxes out at 48 Gbps.
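To put those pin rates in perspective, aggregate memory bandwidth scales linearly with the per-pin data rate and the bus width. A minimal sketch of the arithmetic (the 256-bit bus width is an illustrative assumption, not a specific product's configuration):

```python
def gddr_bandwidth_gb_s(pin_rate_gbps: float, bus_width_bits: int) -> float:
    """Aggregate bandwidth in GB/s: per-pin data rate (Gbps) times
    bus width (bits), divided by 8 bits per byte."""
    return pin_rate_gbps * bus_width_bits / 8

# Today's 32 Gbps GDDR7 on a hypothetical 256-bit bus:
print(gddr_bandwidth_gb_s(32, 256))  # 1024.0 GB/s
# The spec's 48 Gbps ceiling on the same bus width:
print(gddr_bandwidth_gb_s(48, 256))  # 1536.0 GB/s
```

This is why the 48 Gbps ceiling leaves so much headroom: the same bus width delivers 50% more bandwidth without adding memory chips or widening the interface.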
We don't expect the standard to be utilized at its fullest potential until at least 2027-2028, so the timeline matches up. Samsung was the first to supply GDDR7 DRAM for NVIDIA's RTX 50 and RTX PRO Blackwell GPUs, with Micron and SK hynix being added to the list a few months later.
Also, with DDR6 positioned for 2029-2031, it looks like it will be a few years before we see anything beyond DDR5 in conventional desktop and laptop PCs.
Slides: HBF use cases and 2024 computer-type shipment data (including servers and smartphones); Memory-Oriented Domain-Specific Computer Architecture (3D memory/logic stacking); a comparison of Processing-in-Memory (PIM) and Processing-Near-Memory (PNM) with examples and advantages.
For future NAND products, SK hynix is planning 400+ layered 4D NAND and HBF "High-Bandwidth Flash" solutions. High-Bandwidth Flash is said to address the AI inference requirements for next-gen PCs, so it will be interesting to see how the tech performs and works in actual use cases.
Full Stack AI Memory lineup
- While memory solutions to date have been centered on computing, memory will, in the future, evolve to diversify and expand its role – enabling more efficient use of computing resources and structurally resolving AI inference bottlenecks. New memory solutions may include SK hynix's Custom HBM, AI DRAM (AI-D), and AI NAND (AI-N).
- (Custom HBM) As the focus of the AI market expands from commodity products to inference efficiency and optimization, HBM is likewise evolving from conventional into custom products. Custom HBM integrates certain GPU and ASIC functions into the HBM base die to reflect customer needs. This can maximize GPU and ASIC performance and reduce data-transfer power consumption between the processor and HBM, thereby enhancing system efficiency.
- (AI-D) DRAM has developed with a focus on commodity products and compatibility. However, SK hynix is now further segmenting its DRAM lineup to prepare memory solutions best suited to the needs of each segment.
- First, the company is preparing “AI-D O (Optimization)”, a low-power, high-performance DRAM that helps reduce the total cost of ownership and improve operational efficiency. Second, to overcome the Memory Wall, the company is developing “AI-D B (Breakthrough)”, a solution featuring ultra-high-capacity memory with flexible memory allocation. Finally, from the perspective of expanding applications, it is preparing “AI-D E (Expansion)” to extend DRAM use cases into fields including robotics, mobility and industrial automation.
With that said, most of SK hynix's announcements look well into the future. These products are 3-4 years away, so a lot can change; still, we look forward to seeing the next generation of memory technologies arrive.
By: DocMemory Copyright © 2023 CST, Inc. All Rights Reserved