SINGAPORE/SEOUL (Reuters) - A version of Samsung Electronics’ fifth-generation high bandwidth memory (HBM) chips, or HBM3E, has passed Nvidia’s tests for use in its artificial intelligence (AI) processors, three sources briefed on the results said.
The qualification clears a major hurdle for the world’s biggest memory chipmaker. Samsung has been struggling to catch up with local rival SK Hynix in the race to supply advanced memory chips capable of handling generative AI work.
Samsung and Nvidia have yet to sign a supply deal for the approved eight-layer HBM3E chips. Sources expect supplies to start by the fourth quarter of 2024.
The South Korean technology giant’s 12-layer version of HBM3E chips has yet to pass Nvidia’s tests. The sources declined to be identified as the matter remains confidential.
Both Samsung and Nvidia declined to comment.
HBM is a type of dynamic random access memory, first produced in 2013, in which chips are vertically stacked to save space and reduce power consumption. A key component of GPUs used for AI, it helps process the massive amounts of data produced by complex applications.
Samsung has been seeking to pass Nvidia’s tests for HBM3E and fourth-generation HBM3 models since last year. However, it has struggled due to heat and power consumption issues, Reuters reported in May, citing sources.
The company has since reworked its HBM3E design to address those issues, according to the sources who were briefed on the matter.
Samsung said after the publication of the Reuters article in May that claims its chips had failed Nvidia’s tests due to heat and power consumption problems were untrue.
“Samsung is still playing catch up in HBM,” said Dylan Patel, founder of semiconductor research group SemiAnalysis.