Samsung to Expand AI-powered Memory Portfolio


This photo provided by Samsung Electronics Co. on Aug. 24, 2021, shows the company’s HBM-PIM products.

SEOUL, Aug. 24 (Korea Bizwire) – Samsung Electronics Co. said Tuesday that it aims to expand its memory products powered by its artificial intelligence (AI) engine, as the South Korean tech giant pushes to strengthen its technology leadership in the semiconductor market.

Samsung, the world’s largest memory chip supplier, showcased its latest processing-in-memory (PIM) technology at Hot Chips 33, an annual semiconductor conference for microprocessors and integrated circuit innovations.

PIM is a technology that integrates an AI engine in the memory core to process some of the logic functions, enhancing high-speed data processing in supercomputers and AI applications.
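To make the idea concrete, the following toy sketch contrasts a conventional memory, where every operand must travel across the bus to the host processor, with a PIM-style memory that performs a simple reduction next to the data and returns only the result. This is a conceptual illustration only; the class names and counters are invented for the example and do not represent Samsung's PIM interface.

```python
# Conceptual illustration of processing-in-memory (PIM): not Samsung's API,
# just a toy model of computing next to the data instead of shipping every
# operand to the host CPU.
import numpy as np


class PlainMemory:
    """Conventional memory: every element crosses the bus to the host."""

    def __init__(self, data):
        self.data = np.asarray(data, dtype=np.float32)
        self.bus_transfers = 0

    def read_all(self):
        self.bus_transfers += self.data.size  # all elements move to the host
        return self.data


class PIMMemory(PlainMemory):
    """Toy PIM bank: the reduction runs 'in memory'; only the result moves."""

    def local_sum(self):
        result = float(self.data.sum())  # computed inside the memory device
        self.bus_transfers += 1          # a single value crosses the bus
        return result


values = np.arange(1_000_000)

plain = PlainMemory(values)
host_sum = float(plain.read_all().sum())   # host pulls 1,000,000 elements

pim = PIMMemory(values)
pim_sum = pim.local_sum()                  # host pulls one result

print(host_sum == pim_sum)                      # True: same answer
print(plain.bus_transfers, pim.bus_transfers)   # 1000000 vs. 1 transfers
```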

In February, Samsung introduced the industry's first high bandwidth memory (HBM) PIM, the Aquabolt-XL, which incorporates AI processing into its HBM2 Aquabolt.

The HBM-PIM has since been tested in the Xilinx Alveo AI accelerator, where it boosted system performance 2.5 times while reducing energy consumption by more than 60 percent.

“Samsung plans to expand its AI memory portfolio by working with other industry leaders to complete the standardization of the PIM platform in the first half of 2022,” it said.

“The company will also continue to foster a highly robust PIM ecosystem in assuring wide applicability across the memory market.”

This photo provided by Samsung Electronics Co. on Aug. 24, 2021, shows its Acceleration DIMM product.

At the latest Hot Chips conference, Samsung introduced the Acceleration DIMM (AXDIMM) that allows AI processing in the DRAM module.

It minimizes large data movements between the CPU and DRAM, thus improving the energy efficiency of AI accelerator systems.

“Currently being tested on customer servers, the AXDIMM can offer approximately twice the performance in AI-based recommendation applications and a 40 percent decrease in system-wide energy usage,” Samsung said.

The company also introduced its LPDDR5-PIM mobile memory technology, which brings AI processing to devices without data center connectivity.

“Simulation tests have shown that the LPDDR5-PIM can more than double performance while reducing energy usage by over 60 percent when used in applications such as voice recognition, translation and chatbots,” Samsung said.

(Yonhap)
