Samsung Electronics will begin mass production of its sixth-generation high-bandwidth memory (HBM4) next month, with deliveries going to Nvidia.
The move signals a strategic push by Samsung to reclaim market share from market leader SK Hynix, which controls about 62% of the AI memory market and is currently Nvidia's sole HBM supplier.
What HBM4 Means for AI
HBM4 is designed for artificial-intelligence and high-performance-computing workloads, with initial tests showing a bandwidth of about 2 TB/s over a 2048-bit interface, twice the width of previous HBM generations. Analysts at TrendForce expect HBM4 to become the foundation of Nvidia's Vera Rubin AI platform.
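To see how the headline figure relates to the interface width, the quick calculation below is a sketch under assumed numbers: the roughly 8 Gb/s per-pin data rate is an assumption, not a figure from this article, but combined with the 2048-bit interface it lands at about 2 TB/s per stack.

```python
# Rough sanity check with assumed figures (per-pin rate is an assumption, not from the article):
# peak bandwidth per stack = interface width (bits) x per-pin data rate (Gb/s)
interface_width_bits = 2048      # HBM4 interface width cited above
per_pin_rate_gbps = 8            # assumed ~8 Gb/s per pin

peak_gbps = interface_width_bits * per_pin_rate_gbps   # 16,384 Gb/s
peak_tb_per_s = peak_gbps / 8 / 1000                   # Gb/s -> GB/s -> TB/s

print(f"~{peak_tb_per_s:.1f} TB/s per stack")          # ~2.0 TB/s
```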
Nvidia CEO Jensen Huang said earlier this month that the company's next-generation chips, the Vera Rubin platform, are in "full production," as the U.S. company prepares to launch them, paired with HBM4 memory, later this year.
Market Impact and Competition
After the announcement, Samsung shares rose 2.2% while SK Hynix shares fell 2.9%, as investors weighed the prospect of Samsung rejoining Nvidia's HBM supply chain.
Industry projections, including those from Micron, suggest that the HBM market will grow rapidly through 2026, intensifying competition among memory manufacturers.
This underscores the importance of timing for Samsung as it seeks to strengthen its position in the AI market by 2028.
Future Outlook
Samsung expects to begin full-scale HBM4 deliveries around June 2026, to coincide with Nvidia's rollout of the Vera Rubin platform.
If Samsung can maintain yields at scale, it may recapture market share from SK Hynix and help ease the AI memory bottleneck that has constrained data-center buildouts.