Micron Technology has begun mass production of its high-bandwidth memory semiconductors for use in Nvidia’s latest artificial intelligence chip, sending its shares up four per cent before the bell on Monday, Reuters reported.
The HBM3E (High Bandwidth Memory 3E) chip is expected to consume 30 per cent less power than rival offerings and could help meet the growing demand for memory that powers generative AI applications.
Nvidia plans to incorporate the chip into its next-generation H200 graphics processing units, which are set to begin shipping in the second quarter and to surpass the current H100 chip that has driven a significant increase in revenue for the chip designer.
Demand for high-bandwidth memory (HBM) chips used in AI, currently supplied primarily by SK Hynix, has also raised investor expectations that Micron can weather a slow recovery in its other markets.
HBM is one of Micron’s most profitable products, partly because of the technical complexity involved in its construction. The company had previously projected “several hundred million” dollars of HBM revenue in fiscal 2024 and anticipated continued growth in 2025.