Memory semiconductor suppliers offering high bandwidth memory (HBM) chips for AI processors are seeing an unprecedented surge in sales, offsetting demand and inventory correction problems elsewhere.
It is rare for any segment of the semiconductor industry to be as sure of multi-year sales growth forecasts as memory vendors supplying the artificial intelligence (AI) market are today. High demand for HBM from AI accelerator developers like Nvidia Corp. arrived in time to rescue vendors in the segment from the severely depressed sales they had endured only a few years ago. SK Hynix is a good example of the changing dynamics in the memory sector. Revenue at the Korean company more than doubled in 2024, rising 102 percent to KRW66.2 trillion ($46.1 billion) from KRW32.8 trillion in 2023. A year earlier, SK Hynix had tumbled into a severe sales slump, its revenue falling 26 percent from the record KRW44.6 trillion it had reported in 2022.
SK Hynix and competitor Micron Technology Inc. have not only recovered; they are posting surprisingly hefty sales increases. Analysts on average expect SK Hynix to report sales of KRW82.9 trillion in 2025, up 25 percent. Observers forecast the revenue growth to continue in 2026 and for several years beyond. “SK Hynix forecasts that the demand of HBM and high-density server DRAM, which is essential in high performance computing, will continue to increase as the global big tech companies’ investment in AI servers grow and AI inference technology gains importance,” the company said in a statement.
Researchers expect enterprises and governments to continue investing huge sums in AI programs over the next decade. To meet the demand, SK Hynix said it will “push for the transition to advanced process, necessary for the production of competitive DDR5 and LPDDR5.” The new investments should pay off almost immediately thanks to strengthening demand. “SK Hynix has built fundamental to achieve sustainable revenues and profits even in times of market correction,” said Kim Woohyun, CFO at SK Hynix, in a statement announcing the company’s latest quarterly results. “While maintaining the profitability-first commitment, the company will make flexible investment decisions in line with market situation.”
Micron is in the same position, said president and CEO Sanjay Mehrotra. During the recent memory market slump, Micron’s revenue fell by almost half, sinking to $15.5 billion in the fiscal year ended August 31, 2023, from $30.5 billion in the prior fiscal year. Its recovery has since shifted into high gear, with revenue climbing to $25.1 billion in fiscal 2024. In the current fiscal year, analysts estimate Micron’s revenue at $35 billion, up 39 percent. The improvement is expected to continue in fiscal 2026, when sales are projected to rise 27 percent to $44.4 billion. Again, AI demand for HBM is powering the growth despite inventory excesses in other market sectors. “AI agents will become ever more capable and address vertical market consumer and enterprise use cases, driving accelerating monetization of AI,” Mehrotra said in a presentation. “We have upgraded our view of server unit percentage growth and now expect it to reach low teens in calendar 2024, fueled by strong AI demand as well as a robust traditional server refresh cycle.”
A perfect union of AI and HBM
HBM is not easily manufactured. Vendors have faced yield and productivity problems and have struggled with the demanding engineering involved in building the memory stacks, according to industry observers. Another challenge for suppliers is that HBM requires advanced 3D packaging, “including the formation of interconnect pillars and through-silicon vias (TSVs) on both the front and back side of the wafer,” according to experts at Applied Materials. “Besides TSVs, micro-bump pillars are critical to the electrical and thermal performance of HBM stacks,” the company said. “While there has been significant innovation in reducing both the dimension and pitch of micro-bumps, contact resistance grows exponentially as these bumps and their corresponding bond pads shrink in size.”
These challenges have made HBM scarce, delaying the production of accelerators at companies like Nvidia. Samsung Electronics, an early developer of HBM, hit a snag in its HBM production and has struggled to meet the rising demand for memory products. The company is bouncing back, though, according to customers. During the Consumer Electronics Show (CES) in January, for example, Nvidia CEO Jensen Huang assured the audience that the company’s partnership with Samsung for the supply of HBM was on track despite the initial challenges. Nvidia is compelled to work with as many suppliers as possible to assure uninterrupted supplies of HBM, observers said, since its current needs are not being met, hampering its ability to supply servers to its own customers. Micron has limited extra production capacity in place, according to company executives. “Our HBM is sold out for calendar 2025, with pricing already determined for this period,” Mehrotra said.
Projections call for HBM revenue to grow several-fold by the end of this decade. Analysts estimate the market reached $16 billion at the end of last year and, as demand continues to rise, expect it to exceed $100 billion by 2030. That positive forecast is pushing Samsung to overcome the production problems it has faced, they said. For companies like Micron and SK Hynix, meanwhile, the strong multi-year projected growth in demand for HBM is good news that will help erase the tough memories of the last DRAM market downturn. “Our TAM forecast for HBM in 2030 would be bigger than the size of the entire DRAM industry, including HBM, in calendar 2024,” said Micron’s Mehrotra. “This HBM growth will be transformational for Micron. Leading-edge DRAM supply remains tight, driven by robust demand in data center DRAM, including HBM.”