Samsung’s 12-Layer HBM3E Power AMD’s MI350 AI Accelerator Line

Samsung may have failed to supply its 12-layer HBM3E chips to Nvidia, but it has succeeded in securing a deal with rival AMD. AMD recently confirmed that its next-generation accelerator, the MI350 series, will feature Samsung’s 12-layer HBM3E chips. The deal could help the Korean firm regain momentum in the growing HBM market.
Samsung supplies 12-layer HBM3E chips to AMD
Samsung has invested heavily in advancing its HBM process, but the company has repeatedly failed to secure major orders. Recurring issues related to heat management and power consumption have led to multiple setbacks. Thankfully, it has now attracted one of the biggest AI chipmakers in the industry.
AMD recently launched its Instinct MI350 series for data centers, which includes the MI350X and MI355X GPUs. These GPUs use the company’s CDNA 4 architecture and are designed to meet the demands of modern AI infrastructure. AMD also confirmed that the new accelerators carry 288GB of HBM3E supplied by Samsung and Micron.
So, AMD has diversified its supply chain — Samsung is not the sole supplier for the MI350 series. However, securing a spot in AMD’s supply chain is a major achievement for the Korean company. It will not only help boost revenue but also play a key role in rebuilding trust with other clients.
Now that Samsung has secured a foothold in the HBM market, the company has an opportunity to show the competitiveness of its HBM solution in real-world AI applications. Korean media reports that Samsung might start supplying HBM3E to Nvidia as early as this month, which could improve its HBM business in the second half of the year.
Apart from the MI350 series, AMD has also confirmed that its 2026 MI400 series will use 432GB of HBM4. While the company did not disclose its suppliers, industry watchers expect AMD to again choose Samsung for HBM4 chips. That would not be surprising, given Samsung’s recent progress in HBM4 development.
