Samsung Could Earn Billions Supplying HBM4 Chips for Google’s TPU
Samsung is recognized as one of the world’s leading manufacturers of semiconductor memory chips. Its latest and most advanced offering is the sixth-generation high-bandwidth memory, known as HBM4. These cutting-edge memory chips are set to be used in some of the fastest AI accelerators globally. Notably, Samsung’s HBM4 chips will be incorporated into Google’s most powerful Tensor Processing Unit (TPU), scheduled for release next year.
According to a report from South Korea, Samsung’s HBM4 memory chips have successfully passed Google’s rigorous qualification tests. These chips will be integrated into Google’s seventh-generation TPU, which carries the codename Ironwood. The Ironwood TPUs will be produced by Broadcom. Recently, Broadcom’s Chairman and CEO, Hock Tan, visited South Korea to meet with Jun Young-hyun, the head of Samsung’s Device Solutions division. During this visit, the two parties reportedly signed a supply agreement for HBM4 chips.
Details of Samsung’s HBM4 Supply Deal and Future Prospects
Industry insiders reveal that Hock Tan discussed the terms of the HBM4 chip supply with Samsung, requesting a supply agreement extending through 2028. However, Samsung has so far only confirmed supply of HBM4 chips through 2026, agreeing to revisit and negotiate further supply terms at a later date. In addition to HBM4, Samsung will continue supplying fifth-generation HBM chips, known as HBM3E, for Google’s current sixth-generation TPU.
Overall, Samsung is expected to supply three times as many HBM chips for Google’s TPUs as under previous agreements. Approximately half of Samsung’s HBM4 chips will be delivered to Broadcom, while the remaining half will be supplied to Nvidia for its Rubin GPU. Samsung has already sent its final HBM4 samples to Nvidia for quality approval and is anticipated to receive approval later this month.
Samsung’s Growing Role in the AI Memory Chip Market
Despite losing out to Micron and SK Hynix in supplying HBM3E chips to Nvidia last year, Samsung has made significant improvements. The company has enhanced the performance of its HBM chips and increased production capacity. These developments have positioned Samsung very well in the market for AI data centers and servers.
With these advancements and new supply agreements, Samsung could earn billions by supplying its advanced memory chips for AI applications. The company’s role in powering AI accelerators like Google’s TPU and Nvidia’s GPUs highlights its growing influence in the semiconductor memory industry. Samsung’s HBM4 chips, in particular, are expected to play a crucial role in the next generation of AI hardware, potentially generating substantial revenue for the company in the coming years.
