
Lee Kang-wook, senior vice president in charge of packaging development at SK Hynix, speaks at the Artificial Intelligence (AI) Summit during Semicon Korea 2026, held at COEX in Gangnam, Seoul, on Feb. 11. (Yonhap)
SEOUL, Feb. 11 (Korea Bizwire) — SK Hynix, the world’s largest supplier of high-bandwidth memory for artificial intelligence systems, said Wednesday it will deepen its push into customized memory solutions as chipmakers and AI developers demand more tailored performance from next-generation products.
Speaking at Semicon Korea 2026 in Seoul, Lee Kang-wook, a senior vice president in charge of packaging development, said demand is rising for system-in-package designs optimized to specific customer needs as the industry moves toward seventh- and eighth-generation HBM, known as HBM4E and HBM5.
To meet that demand, SK Hynix plans to introduce what it calls the “HBM B·T·S” concept — memory variants specialized for bandwidth performance (B), thermal dissipation (T) and space efficiency (S). The approach reflects a shift away from one-size-fits-all memory chips toward highly customized solutions for AI accelerators and data center systems.
“Customers are increasingly seeking HBM that is specialized in particular areas,” Lee said, adding that advanced packaging technologies would allow the company to respond more flexibly to diverse platform requirements.

On Jan. 8 (local time), the third day of CES, a video showcasing HBM4 plays at the SK Hynix booth at the Venetian Convention Center in Las Vegas, Nevada. (Yonhap)
The company also signaled a technological leap in stacking density. SK Hynix, which has developed proprietary MR-MUF (mass reflow-molded underfill) packaging capable of stacking up to 16 layers, expects hybrid bonding technology to become essential for products exceeding 20 layers.
The industry is currently debating whether 20- and 24-layer stacks can be implemented within a 775-micrometer height limit, recently relaxed from 720 micrometers by the global standards body JEDEC.
SK Hynix was the first to complete development of HBM4 and has established mass production capacity. It is widely expected to supply roughly two-thirds of Nvidia’s HBM4 demand this year, underscoring its central role in the AI chip ecosystem.
According to Counterpoint Research, the company held a commanding 57 percent share of the global HBM market in the third quarter of last year, cementing its lead as demand for AI hardware accelerates worldwide.