The artificial intelligence boom is reshaping the memory market, with High Bandwidth Memory (HBM) expected to lead the charge through 2025.
Recent news reports and market analyses point to a tectonic shift in demand, driven directly by the ever-growing appetite for memory bandwidth in AI workloads and high-performance computing (HPC) applications.
According to the Q1 2025 Storage Report, the expanding AI build-out fueled 62% year-over-year growth in the server and storage component market during Q1 2025. That demand is driven largely by AI accelerators and, crucially, by HBM. Market intelligence provider TrendForce projects HBM shipments to grow a whopping 70% year over year, reshaping the DRAM industry as production capacity shifts increasingly toward HBM.
Major players such as SK Hynix, Samsung, and Micron are investing heavily in HBM manufacturing, and by the end of the decade HBM's share of the overall DRAM market is anticipated to reach as high as 50%.
This rapid transition is rooted in HBM's inherent advantages: its 3D-stacked configuration, which places DRAM dies atop one another and connects them through a very wide interface, delivers the bandwidth and energy efficiency required to feed complex AI models and real-time data processing. Micron, for example, has sampled 12-layer stacked 36GB HBM4 to major customers and expects mass production supporting next-generation AI platforms to begin in 2026.
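To make the stacking advantage concrete, here is a back-of-the-envelope Python sketch comparing the peak bandwidth of a single memory device across interface widths. The interface widths follow the JEDEC standards (HBM3 uses a 1024-bit interface; HBM4 doubles it to 2048 bits), but the per-pin speeds used are representative assumptions, not any specific product's ratings.

```python
# Back-of-the-envelope sketch: why wide, stacked interfaces drive HBM bandwidth.
# Interface widths reflect the JEDEC standards; pin speeds are illustrative
# assumptions for comparison, not vendor product specs.

def stack_bandwidth_gbs(bus_width_bits, pin_speed_gbps):
    """Peak bandwidth of one memory device/stack in GB/s."""
    return bus_width_bits * pin_speed_gbps / 8

configs = {
    # name: (bus width in bits, assumed per-pin speed in Gb/s)
    "GDDR6 device (x32)": (32, 16.0),
    "HBM3 stack (x1024)": (1024, 6.4),
    "HBM4 stack (x2048)": (2048, 8.0),  # assumed pin speed
}

for name, (width, speed) in configs.items():
    print(f"{name}: {stack_bandwidth_gbs(width, speed):,.0f} GB/s")
```

Even at a lower per-pin speed than GDDR6, the sheer width of a stacked interface yields roughly an order of magnitude more bandwidth per device, which is exactly what bandwidth-hungry AI accelerators need.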
While conventional DRAM continues to advance, the memory wall – performance limited by memory bandwidth rather than raw compute – is an increasingly important bottleneck for AI workloads, as the sketch below illustrates. Unsurprisingly, this has driven a significant reallocation of capital expenditure toward DRAM (including HBM), potentially tightening supply in the NAND space as investment shifts away.
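The memory wall can be illustrated with a simple roofline-style estimate: a kernel's attainable throughput is capped either by the accelerator's peak compute or by memory bandwidth multiplied by its arithmetic intensity (FLOPs performed per byte moved). The peak-compute and bandwidth figures below are illustrative assumptions, not a specific accelerator's specs.

```python
# Minimal roofline-style sketch of the "memory wall".
# All hardware numbers below are illustrative assumptions, not vendor specs.

PEAK_FLOPS = 1000e12     # assumed accelerator peak compute: 1000 TFLOP/s
HBM_BANDWIDTH = 3.35e12  # assumed HBM bandwidth: 3.35 TB/s

def attainable_flops(arithmetic_intensity):
    """Attainable throughput (FLOP/s) for a kernel with the given
    arithmetic intensity (FLOPs per byte moved to/from memory)."""
    return min(PEAK_FLOPS, HBM_BANDWIDTH * arithmetic_intensity)

# Example: a large matrix multiply reuses each byte many times (high
# intensity), while attention over a long context streams a large KV
# cache with little reuse (low intensity).
for name, intensity in [("GEMM (high reuse)", 300.0),
                        ("KV-cache attention (streaming)", 1.0)]:
    f = attainable_flops(intensity)
    bound = "compute-bound" if f >= PEAK_FLOPS else "bandwidth-bound"
    print(f"{name}: {f / 1e12:.0f} TFLOP/s attainable ({bound})")
```

Under these assumed numbers, the low-reuse kernel reaches only a few TFLOP/s no matter how fast the compute units are; the only way to speed it up is more memory bandwidth, which is precisely what HBM supplies.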
This industry-wide focus on HBM underscores its strategic role in the future of AI infrastructure, fueling faster innovation and more efficient AI-powered applications across data centers, autonomous vehicles, and beyond.