Samsung May Benefit from OpenAI-AMD Chip Deal

Samsung may receive a massive order of HBM chips thanks to a new deal between OpenAI and AMD. The ChatGPT maker’s next-generation AI infrastructure will run on AMD GPUs, and Samsung is in contention to supply the high-performance memory those GPUs require. Such a deal would help the Korean firm boost revenue in its memory business while strengthening its presence in the HBM space.
OpenAI and AMD’s collaboration could boost Samsung’s high-performance memory sales
HBM has become an integral component of advanced computing, opening opportunities for memory makers to sell more high-performance products. Recently, AMD and OpenAI announced a major collaboration to build the infrastructure that will accelerate the future of high-performance and AI computing.
As part of the agreement, OpenAI will deploy multiple generations of AMD Instinct GPUs to power next-generation AI infrastructure. The total power usage will reach 6 gigawatts, with the first 1 gigawatt of AMD Instinct MI450 GPUs scheduled for deployment in the second half of 2026.
These next-generation GPUs require high-performance memory to operate efficiently, and Samsung is a potential supplier. AMD has previously confirmed that its 2026 MI400 series will use HBM4 chips, but it did not disclose the supplier.
In the HBM market, the three main players are SK Hynix, Samsung, and Micron. AMD may choose any of these companies for the HBM4 chips, or even source from all three. That said, Samsung already supplies its 12-layer HBM3E chips to AMD for the latest Instinct MI350 series, which includes the MI350X and MI355X GPUs. The company is therefore likely optimistic about securing a deal for the next-generation MI400 series as well.
Meanwhile, Samsung is developing HBM4 chips with plans to start mass production next year. Industry analysts expect that the company may supply these chips to Nvidia in the second quarter of 2026.