Samsung clears Nvidia hurdle for 12-layer HBM3E supply, setting stage for HBM4 battle - KED Global

2025-09-19 08:34:00 (English original)

By KED Global

Initial supply volumes are limited, but the qualification test pass is expected to support Samsung in the heated AI chip race

A Samsung silicon wafer

By Jeong-Soo Hwang, Chae-Yeon Kim and Eui-Myung Park

Samsung Electronics Co. has won long-awaited validation from Nvidia Corp. for its latest high-bandwidth memory, clearing a critical hurdle in the race to supply chips powering the next wave of artificial intelligence hardware.

The South Korean tech giant recently passed Nvidia’s qualification tests for its fifth-generation 12-layer HBM3E product, according to people familiar with the matter on Friday.

The approval comes about 18 months after Samsung completed development of the chip and follows a string of failed attempts to meet Nvidia’s demanding performance standards.

The milestone marks a symbolic recovery of technological credibility for Samsung in a field that has become one of the most strategically contested corners of the semiconductor industry.

Nvidia is the world's top AI chip designer

Nvidia’s flagship B300 AI accelerator and the MI350 from Advanced Micro Devices Inc. (AMD) are among the systems set to deploy the high-capacity memory.

ALREADY SUPPLYING 12-STACK HBM3E TO AMD

Samsung had already shipped HBM3E 12-stack chips to AMD, but Nvidia, the dominant buyer of advanced memory for AI workloads, had remained out of reach.

Industry officials said the breakthrough owed much to a decision by Jun Young-hyun, Samsung's chip business head and vice chairman, to redesign the DRAM core for HBM3E earlier this year, addressing thermal performance issues that had dogged earlier versions.

Nvidia CEO Jensen Huang discusses the company's new GPU released in 2024 (Courtesy of Yonhap)

Volumes of 12-layer HBM3E chips to be supplied to Nvidia are expected to be relatively small, as Samsung is the third supplier to secure approval, following SK Hynix Inc. and Micron Technology Inc., sources said.

“For Samsung, the supply is less about revenue and more about pride,” said an industry executive. “Recognition from Nvidia means its technology is back on track.”

REAL BATTLEGROUND: HBM4

The real battleground, however, is already shifting to the next generation.

HBM4, the sixth generation of high-bandwidth memory, is scheduled to debut next year in Vera Rubin, Nvidia’s next-generation GPU architecture and the successor to its Blackwell AI chips.

(Graphics by Dongbeom Yun)

Samsung is aiming to close the gap with rivals by pairing its most advanced 10-nanometre-class (1c) DRAM with a 4 nm logic die produced at its own foundry, compared with the 1b DRAM and 12 nm logic processes adopted by competitors.

Early performance indicators are encouraging for Samsung.

Nvidia has asked suppliers to push HBM4 data transfer speeds beyond 10 gigabits per second, well above the current industry standard of 8 Gbps.

Samsung has demonstrated 11 Gbps, outpacing SK Hynix’s 10 Gbps and leaving Micron struggling to meet the requirements, according to people briefed on the situation.

Samsung plans to ship large volumes of HBM4 samples to Nvidia this month, aiming to secure qualification at an early stage.

AMD MI350 series (Screenshot captured from AMD's website)

IN HBM4 TALKS WITH NVIDIA, BROADCOM, GOOGLE

In April, the Korean chipmaker said it was in talks to supply customized sixth-generation HBM4 chips to major AI chipmakers, including Nvidia, Broadcom Inc. and Google LLC.

Samsung said it could begin supplying HBM4 chips in large volumes to its clients as early as the first half of 2026.

Samsung has lagged behind its crosstown rival SK Hynix in the HBM race, having until now failed to secure deals for its advanced memory chips with Nvidia.

SK Hynix is the primary provider of the latest HBM chips to the US AI chip designer.

Samsung's 12-layer HBM3E 12H chip

To catch up with SK Hynix, Samsung last year teamed up with its foundry archrival, Taiwan Semiconductor Manufacturing Company Ltd. (TSMC), to jointly develop HBM4.

Analysts said that success in the next round of HBM4 chip testing would allow Samsung to claw back market share in a memory segment central to AI computing.

Asked about its 12-layer HBM3E chip supply to Nvidia, Samsung said it does not confirm or comment on any deals with its clients.

Write to Jeong-Soo Hwang, Chae-Yeon Kim and Eui-Myung Park at hjs@hankyung.com

In-Soo Nam edited this article.



