Samsung HBM4 May Take Exclusive Spot in Nvidia’s High-End AI Chip

Samsung recently began commercial shipment of its 6th-generation high-bandwidth memory, HBM4, to key clients. A fresh report now claims that the company could exclusively supply HBM4 for the top-tier version of Nvidia’s Vera Rubin AI chip. Nvidia is reportedly likely to choose Samsung because its HBM4 offers industry-leading performance.
Nvidia reportedly taps Samsung’s HBM4 for its high-performance Vera Rubin
According to a report from Chosun, Nvidia plans to introduce two categories of its next-gen HBM4-powered AI chip. The first is a general product line that focuses on stability, while the second is a high-performance line for AI infrastructure that requires greater speed. Samsung’s HBM4 operates at a consistent 11.7 Gbps, 46% higher than the current industry standard of 8 Gbps. More importantly, the maximum speed can reach up to 13 Gbps, making it suitable for advanced AI workloads.
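As a quick sanity check on the reported figures, the sketch below computes the per-pin speed uplift of Samsung’s HBM4 over the cited 8 Gbps industry standard (the speed values come from the report; the function name is our own illustration):

```python
# Verify the reported HBM4 per-pin speed uplift figures.
def pct_uplift(new_gbps: float, baseline_gbps: float) -> float:
    """Percentage increase of one per-pin data rate over another."""
    return (new_gbps / baseline_gbps - 1) * 100

samsung_standard = 11.7  # Gbps, consistent operating speed per the report
samsung_max = 13.0       # Gbps, reported maximum speed
industry_baseline = 8.0  # Gbps, cited industry standard

print(f"{pct_uplift(samsung_standard, industry_baseline):.0f}%")  # prints "46%"
print(f"{pct_uplift(samsung_max, industry_baseline):.1f}%")       # prints "62.5%"
```

The 11.7 Gbps figure works out to roughly 46% above the 8 Gbps baseline, matching the report, while the 13 Gbps maximum would be about 62.5% above it.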
Nvidia is reportedly working with Samsung to ensure that the HBM4 memory delivers peak performance for its high-end Vera Rubin NVL72 model. The Korean firm’s decision to use more advanced technology than its rivals appears to be paying off. For example, Samsung’s HBM4 uses 10nm 6th-generation (1c) DRAM, while competitors are sticking with the older 10nm 5th-generation (1b) DRAM.
Of course, demand for Nvidia’s high-end AI chips could be lower, since they are expected to cost more than the general-purpose models. For now, it remains unclear what share of Nvidia’s shipments the high-end chips will account for. That will depend largely on the investment strategies of key tech companies such as OpenAI, Google, Microsoft, Meta, and Amazon.
Meanwhile, SK Hynix, which dominates the HBM space, established a strong position during the HBM3E era. Even if it does not supply HBM4 for Nvidia’s high-end AI chips, it is still likely to capture a larger share of the general-purpose segment than Samsung or Micron. Even so, industry watchers expect the market could see a shake-up once Samsung’s 1c DRAM yield reaches a mature level.
