Samsung Unveils Low-Power Server Memory Aimed at Next-Gen AI Data Centers | Be Korea-savvy

With speeds up to 1.25 times faster and 25% better power efficiency compared to the previous generation*, premium low-power DRAM LPDDR5X is transcending mobile devices. (Image from Samsung Semiconductor webpage)

SEOUL, Dec. 18 (Korea Bizwire) — Samsung Electronics Co. said Thursday it has developed low-power server memory solutions capable of supporting continuous artificial intelligence (AI) workloads at large-scale data centers, as it seeks to lead the AI infrastructure market.

The company recently presented samples of its new small outline compression attached memory module 2, SOCAMM2, a low-power double data rate (LPDDR)-based server memory module designed for AI data centers, to customers including Nvidia Corp., according to a post on Samsung Electronics’ newsroom.

Samsung said SOCAMM2 delivers higher bandwidth, improved power efficiency and flexible system integration, enabling AI servers to achieve greater efficiency and scalability.

This photo provided by Samsung Electronics Co. shows its latest small outline compression attached memory module 2, SOCAMM2. (Yonhap)

The company added that its cutting-edge LPDDR technology and collaboration with Nvidia have positioned it at the forefront of the SOCAMM2 market, a segment expected to emerge as a key AI memory technology alongside high bandwidth memory (HBM).

Samsung said it is working closely with Nvidia to optimize SOCAMM2 for Nvidia’s accelerated computing infrastructure through ongoing technical cooperation, ensuring the memory meets the responsiveness and efficiency requirements of next-generation inference platforms.

“Samsung has been contributing to this work alongside key partners, helping to shape consistent design guidelines and enable smoother integration across future AI platforms,” the company said.

(Yonhap)
