MU asserting its leadership. Interesting that Samsung, a formidable force in technology, hasn't even had its offerings validated. Left in the dust.
Amid the intensifying HBM race, Micron has secured a spot with its HBM3E 12H, designed into NVIDIA’s GB300. Notably, according to its press release, the U.S. memory giant is also the only company shipping both HBM3E and SOCAMM memory for AI servers, reinforcing its leadership in low-power DDR for data centres.
According to The Korea Herald, Micron has surprised the industry by announcing mass production of SOCAMM—dubbed the “second HBM”—ahead of SK hynix.
Baird hiked its price target from $130 to $163, signaling growing conviction that Micron's high-bandwidth memory (HBM) chips are about to play a much bigger role in the AI boom. That sentiment is spreading fast. Rosenblatt now sees the stock hitting $200, and Wedbush, UBS, and others are sticking with bullish calls. Why? Simple: Micron isn't just riding the AI wave; it's building the surfboard. HBM sales topped $1 billion last quarter, beating internal forecasts and jumping 50% sequentially. More importantly, demand is sold out for the year, and the TAM forecast for 2025 just surged from $20B to $35B.
Internally, Micron says it holds about 9% of the $20B HBM market (Dec '24), appears to already be on a ~$4B annualized (TTM) run rate, and anticipates its market share reaching 25% of a $100B market. This suggests Micron could grow HBM sales from zero 12 months ago to $25B annually by 2030, effectively doubling its total revenue.
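A rough back-of-envelope on those numbers (all figures as stated in this thread, not official Micron guidance):

```python
# Back-of-envelope check on the HBM figures quoted above.
# These inputs come from the post itself, not from Micron's filings.

hbm_tam_2024 = 20e9        # ~$20B HBM market (Dec '24)
micron_share_2024 = 0.09   # ~9% share claimed
quarterly_hbm = 1e9        # HBM sales topped $1B last quarter

annual_run_rate = quarterly_hbm * 4      # ~$4B annualized (TTM)
hbm_tam_2030 = 100e9                     # projected ~$100B market
target_share = 0.25                      # anticipated 25% share

implied_2030_hbm = hbm_tam_2030 * target_share

print(f"Implied 2024 HBM revenue at 9% share: ${hbm_tam_2024 * micron_share_2024 / 1e9:.1f}B")
print(f"Current annualized run rate: ${annual_run_rate / 1e9:.0f}B")
print(f"Implied 2030 HBM revenue at 25% share: ${implied_2030_hbm / 1e9:.0f}B")
```

Note the run rate (~$4B) already exceeds 9% of a $20B market (~$1.8B), which is consistent with the market itself growing fast under Micron.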
This is a classic: look at the game being played out, not the scoreboard!
HBM isn’t just important—it’s foundational to AI servers, acting as the high-speed circulatory system for data-intensive AI workloads. For a chip like “Feynman” with “huge amounts” of HBM, it will be the backbone enabling breakthroughs in model size, training speed, and inference efficiency. As AI servers evolve, HBM’s role will shift from critical to utterly indispensable, driving both technical and economic outcomes in the AI race.
This is why we invested in MU: HBM is its iPhone moment.