Micron Technology
-
The place to discuss all things Micron, specifically around HBM3 and HBM4
-
In the news today, Micron is looking for a construction partner to build phase 2 of its fabrication (fab) plant in India; bids have been received from KEC and Larsen & Toubro. Phase one will commence production any moment now. Micron contributed $825M and the Indian government $2.75B (nice deal), plus a further $550M in tax incentives. How to build a factory and get others to pay for it.
Either way, it's encouraging to see expansion in a critical accelerated-compute component.
-
Today Micron announced it is set to begin mass production of its 12-high HBM3E memory and will supply it to Nvidia. The 12-high stack is a generational improvement over their current 8-high parts, lifting capacity from 24GB to 36GB per stack; a GPU carrying eight of those stacks gets 288GB of memory in total.
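Quick sanity-check on those capacity numbers: a minimal sketch, assuming 24Gb (3GB) DRAM dies per layer, an assumption consistent with the 24GB 8-high and 36GB 12-high figures above:

```python
# Back-of-envelope HBM3E capacity arithmetic.
# Assumption: 24Gb (3GB) DRAM dies per layer, consistent with the
# 24GB 8-high and 36GB 12-high figures in the post above.
GB_PER_DIE = 3

def stack_capacity_gb(layers: int) -> int:
    """Total capacity of one HBM stack with the given number of DRAM layers."""
    return GB_PER_DIE * layers

print(stack_capacity_gb(8))        # 24  -> GB per 8-high stack
print(stack_capacity_gb(12))       # 36  -> GB per 12-high stack
print(8 * stack_capacity_gb(12))   # 288 -> GB on a GPU with eight 12-high stacks
```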
HBM is all about bandwidth: how much data the memory can transfer to the CPU/GPU. A good analogy is a motorway, with lanes and cars.
HBM3E has a 1,024-bit interface, so think of a 1,024-lane motorway
8-high vs 12-high is how many storeys of memory sit above that motorway (more layers means more capacity)
Cars are the data
The speed limit is the data rate: HBM3E is about 25% faster than HBM3, and HBM4 (2026) is expected to be up to 50% faster again. All told, a 12-high solution can move approx 60GB of data per second, which is a vast speed and why it is so expensive.
-
I think you're underselling HBM3E there; the DDR4-2400 in my old workstation moves data at about 64GB/s. Actually it's a lot faster, as it's 8-channel, not quad-channel.
According to Micron, HBM3E bandwidth is 1.2 TB/s per stack. Now that's fast.
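For anyone who wants the arithmetic: peak bandwidth per stack is the interface width times the per-pin data rate. A minimal sketch, where the ~9.2 Gb/s HBM3E pin rate is my assumption, chosen to match Micron's quoted 1.2 TB/s:

```python
# Peak bandwidth = interface width (bits) * per-pin rate (Gb/s) / 8 bits-per-byte.
def peak_bandwidth_gb_s(width_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth of one memory interface in GB/s."""
    return width_bits * pin_rate_gbps / 8

# One HBM3E stack: 1,024-bit interface at ~9.2 Gb/s per pin (assumed rate).
print(peak_bandwidth_gb_s(1024, 9.2))    # ~1177.6 GB/s, i.e. ~1.2 TB/s

# The 8-channel DDR4-2400 workstation above: 8 channels of 64 bits at 2.4 Gb/s.
print(8 * peak_bandwidth_gb_s(64, 2.4))  # ~153.6 GB/s
```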
-
Micron appoints Mark Liu to its board. Mark spent 30 years at TSMC and was Chairman and co-CEO. He is a titan in the industry, and it's a big deal for Micron: he was instrumental in TSMC becoming the world's best foundry. He retired in 2024 and handed the reins over to C.C. Wei (at TSMC). He will bring deep expertise to the company. Micron is up 8% today
-
Micron’s 50% DRAM Outsourcing Boost Signals Major HBM Push
Micron Technology has ramped up its outsourcing of DRAM chip packaging and testing by 50% with Taiwan-based partner Powertech Technology, a move that underscores its aggressive pivot towards High Bandwidth Memory (HBM) production. This strategic shift, reported in early 2025, allows Micron to offload a significant portion of its DRAM workload, freeing up its own manufacturing lines to focus on the burgeoning HBM market—projected to soar from £3.1 billion in 2023 to over £19.5 billion by 2025.
The implications are clear: by contracting out this additional DRAM capacity, Micron is accelerating its HBM expansion at a critical time. HBM, prized for its high-speed performance in AI and high-performance computing applications, is in skyrocketing demand, and Micron aims to increase its market share from 10% to 20-25% by mid-2025. This outsourcing decision not only optimises its production capabilities but also signals strong confidence in HBM’s profitability, especially as the company’s latest quarterly net sales hit £6.8 billion, with HBM revenue doubling.
Industry analysts see this as a savvy play. With a new £5.5 billion HBM-focused facility in Singapore set to launch in 2026, and plans for HBM4 and HBM4E rollouts in 2026 and 2027/2028 respectively, Micron is positioning itself to challenge South Korean leaders SK Hynix and Samsung. The freed-up capacity could help Micron meet commitments like supplying HBM3E 8H memory for Nvidia’s Blackwell architecture, showcased at CES 2025 with a blistering 1.8 terabytes per second bandwidth. Moreover, with £4.8 billion in US CHIPS Act funding secured in December 2024, Micron’s domestic and global manufacturing footprint is expanding rapidly.
This 50% DRAM outsourcing hike is more than a logistical tweak—it’s a bold statement of intent. By prioritising HBM over traditional DRAM in its own facilities, Micron is betting big on the AI-driven future, potentially reshaping the competitive landscape of the memory industry.
Source: TechPulse
-
Last night Micron (MU) reported their fiscal Q2/2025 earnings.
Revenue $8.05B, +38.3% Y/Y! (beat by $150M), and EPS $1.56 (beat by 14c).
CEO Sanjay Mehrotra (a brilliant engineer) said: 'Micron delivered above guidance, and data centre revenue is up 300% from a year ago. We are extending our technological lead with our 1-gamma DRAM node. We expect record quarterly revenue in Q3'
The outlook is revenue guidance of $8.8B.
Comments from the conference call:
HBM revenue grew +50% sequentially to over $1B this quarter. HBM shipments were ahead of our plans, and we are the only company globally to have shipped low-power DRAM into the data centre in high volumes. Momentum is building. We expect Q3 to be another record.
Our 1-beta DRAM leads the industry and we are extending that leadership further with the launch of our 1-gamma node.
In January, we broke ground on an HBM advanced packaging facility in Singapore, and we are aiming for production at scale in 2027.
As GPU and AI accelerator performance continues to increase, these high-performance processors are starved of memory bandwidth. HBM supplies the bandwidth they need, and we are very excited about the growth opportunities. It is a highly complex and highly valuable product category where our customers recognise Micron as the HBM technology leader.
All 2025 HBM production has been pre-sold and we are already signing agreements for our planned 2026 supply.
Most of our H2 2025 (second half) shipments will comprise our new '12-high' HBM3E (Blackwell Ultra!). Looking ahead, we are enthusiastic about HBM4 for calendar 2026, which is aligned to our customer requirements (Rubin)
Our 9550 NVMe SSD is approved for the GB200 NVL72
We see promise in the automotive sector: memory and storage content in cars continues to increase as AI-enabled in-car infotainment systems grow richer. Advanced robotaxi platforms today contain over 200GB of DRAM
-
MU asserting its leadership. Interesting that Samsung, a formidable force in technology, hasn't even had its offerings validated yet. Left in the dust.
Amid the intensifying HBM race, Micron has secured a spot with its HBM3E 12H designed into NVIDIA’s GB300. Notably, according to its press release, the U.S. memory giant is also the only company shipping both HBM3E and SOCAMM memory for AI servers, reinforcing its leadership in low-power DDR for data centres. According to The Korea Herald, Micron has surprised the industry by announcing mass production of SOCAMM, dubbed the "second HBM", ahead of SK hynix.
Baird hiked its price target from $130 to $163, signaling growing conviction that Micron's high-bandwidth memory (HBM) chips are about to play a much bigger role in the AI boom. That sentiment is spreading fast: Rosenblatt now sees the stock hitting $200, and Wedbush, UBS, and others are sticking with bullish calls. Why? Simple: Micron isn't just riding the AI wave; it's building the surfboard. HBM sales topped $1 billion last quarter, beating internal forecasts and jumping 50% sequentially. More importantly, demand is sold out for the year, and the TAM forecast for 2025 just surged from $20B to $35B.
Internally, Micron today say they hold 9% of a $20B HBM market (Dec '24), already appear to be on a ~$4B annualised run rate, and anticipate their market share reaching 25% of a $100B market. This suggests Micron could grow HBM sales from zero 12 months ago to $25B annually by 2030, effectively doubling their total revenue.
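A rough back-of-envelope check of that maths; every figure below comes from the post above and is a projection, not company guidance:

```python
# Figures quoted in the post above (projections, not guidance).
hbm_tam_2024 = 20e9       # HBM market size, Dec '24
mu_share_2024 = 0.09      # Micron's stated share today
hbm_tam_2030 = 100e9      # projected market size by 2030
mu_share_2030 = 0.25      # Micron's targeted share

print(hbm_tam_2024 * mu_share_2024 / 1e9)  # ~1.8 -> implied $B/yr today, vs the ~$4B run rate cited
print(hbm_tam_2030 * mu_share_2030 / 1e9)  # 25.0 -> $B/yr by 2030, roughly Micron's entire current revenue
```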
This is a classic: watch the game being played, not the scoreboard!
HBM isn’t just important—it’s foundational to AI servers, acting as the high-speed circulatory system for data-intensive AI workloads. For a chip like “Feynman” with “huge amounts” of HBM, it will be the backbone enabling breakthroughs in model size, training speed, and inference efficiency. As AI servers evolve, HBM’s role will shift from critical to utterly indispensable, driving both technical and economic outcomes in the AI race.
This is why we invested in MU: the HBM is the iPhone moment