
At Samsung’s booth during SEDEX 2025, held on Oct. 22 at COEX in Seoul’s Gangnam District, physical samples of the company’s sixth-generation HBM4 and fifth-generation HBM3E high-bandwidth memory chips are on display. (Yonhap)
SEOUL, Dec. 4 (Korea Bizwire) — South Korea’s two largest chipmakers, Samsung Electronics and SK Hynix, have completed preparations to supply Nvidia with next-generation high-bandwidth memory (HBM4), positioning themselves for what is expected to be a pivotal year in the fast-expanding AI memory market.
Samsung recently finished its final performance review for HBM4, known as PRA, signaling that both yield and speed targets have been met and that the chip is ready for mass production. SK Hynix, which completed HBM4 development in September and set up production lines soon after, plans its first shipments before the end of this year, with broader sales ramping up in 2026.
With both suppliers prepared, the remaining trigger is Nvidia’s approval and purchase order. HBM4 will power Rubin, Nvidia’s next-generation GPU slated for release next year. Each Rubin processor is expected to use eight HBM4 stacks, while Rubin Ultra, planned for 2027, will incorporate as many as twelve.
Rubin’s launch schedule, however, remains the critical variable. Initially expected in the second quarter of 2026, the GPU has faced delays tied to design verification and performance testing, and industry officials now expect a launch in the second half of the year. Memory production typically begins one quarter ahead of a GPU’s release, suggesting HBM4 mass production could begin as early as the second quarter of 2026.

At SK Hynix’s booth during the 2025 Semiconductor Exhibition (SEDEX), held at COEX in Seoul’s Gangnam District, the company’s sixth-generation high-bandwidth memory chip, the HBM4, was on display. (Yonhap)
SK Hynix dominated the HBM market this year, capturing an estimated 60.8 percent share as of the third quarter, far ahead of Micron’s 22 percent and Samsung’s 17.2 percent. But market research firm TrendForce expects Samsung to regain ground in 2026, projecting the company’s HBM market share to surpass 30 percent as it expands shipments of HBM3E and begins supplying HBM4.
Competition Spreads to GDDR and LPDDR
The AI memory battleground is also widening beyond HBM. Demand for graphics DRAM (GDDR) and low-power DRAM (LPDDR) is poised to surge alongside the rapid scaling of AI inference chips.
Samsung, which leads both markets, is expected to supply GDDR7 for Rubin CPX, Nvidia’s cost-optimized inference GPU set to debut in the second half of 2026. The company currently holds roughly 70 percent of the global GDDR market.

Jensen Huang, Nvidia’s chief executive, signed a GDDR7 graphics memory module during his visit to Samsung Electronics’ booth on March 20, local time, the fourth day of the GTC 2025 developers conference. (Yonhap)
Nvidia’s upcoming Vera Rubin AI accelerator will also incorporate SOCAMM, an LPDDR5X-based memory module designed to boost the performance of the next-generation Vera CPU. Nvidia is in discussions with Samsung, SK Hynix, and Micron to secure next year’s SOCAMM supply, with Samsung likely to claim the largest share given its more than 50 percent hold on the LPDDR market.
Samsung is also preparing to unveil LPDDR6 at CES early next year, positioning the chip for use in mobile devices, on-device AI, and other high-performance, low-power applications.
“With the growing emphasis on efficiency and low-power computing, next year’s AI memory race will be fierce across HBM, GDDR, and LPDDR segments,” said one semiconductor industry official. “As Nvidia finalizes its launch timeline, the revenue trajectory for the major memory makers will become much clearer.”
Kevin Lee (kevinlee@koreabizwire.com)