-
As always, appreciate your insights.
-
Having had a few days to think about it, where's the future for SMCI? Their stock has continued to slide in the last couple of days although it's only down about 5% in the last month.
Does it have a future? Either short- or long-term? If the chaps at Cobens are looking at getting out, then what's the horizon on this?
And what would anyone suggest I should do with the grand or so's worth of SMCI I hold outside of Cobens? Flog it? Flog half of it? Keep it for sh1ts and giggles? I'm in fairly good profit on it so can't complain.
-
Our weight changes regularly. On a fundamentals basis the stock is not overvalued imo, so I am not so much concerned about the multiple. I can't comment on anyone's personal Extra-Portfolio holdings.
The ideas I'm working on at the moment factor this in. As mentioned, it's all about relative opportunities.
-
Are these boys still in the Tech fund?
-
just a smidge
-
Dragging up an old stock here, but SMCI aren't doing that shabbily of late; up 46% in the last month and 86% in the last six months. I know that I don't have the research or knowledge to comment on why they're doing OK and how they might do in the future but I'm glad I held onto my paltry 23 shares!
-
Hi O,
Yes it has. Speculation that they're shipping a lot of high-end metal, and clearly they will ship a lot more given wider market demand. There is only so much DLC (direct liquid cooling) capacity. We will see what they report and, importantly, what their margins are like.
There is no doubt that SM will grow revenue, no doubt at all, but the billion-dollar question is: can they improve margins and start flywheeling earnings? The margin story rests on them gaining traction on the services side, the 'total DC solution' end to end: design, build the bespoke racks, install the cabling (lucrative) and manage the DC via a '4-hour parts replacement SLA' and 'hot spares / parts on-site' service. They manufacture the cooling towers, the power supplies and the chassis via related parties, so they are nicely vertically integrated. All the ingredients for success. And without question SM tech is just better than the competition. Management needs to up its game.
The bottom line is we invested in the stock at $26.50 because we saw opportunity. We've done very well. Watching with interest
-
I follow Dylan Patel, founder of SemiAnalysis. Very knowledgeable on the tech side, not the 'stock' side, but for me it complements the knowledge and allows me to separate the hype from the reality. I know he met Chuck Liang recently to discuss their plans; he thinks something important is about to drop.
Why Supermicro (SMCI) Gets the Spotlight in Dylan’s Tease—And Not Dell, Foxconn, or Wiwynn
SM is named as a key supporter of his imminent 'huge' framework on AI chips, inference, and infrastructure, due to drop by the evening of 9 October. SMCI is listed alongside hyperscalers/cloud players (CoreWeave, Nebius) and hardware/infrastructure specialists (Crusoe, HPE, Tensorwave), highlighting its pivotal role in the rack/server layer for optimised inference stacks. Notably absent are Dell, Foxconn, and Wiwynn (Wistron's AI-focused ODM arm), despite their prominence in AI server markets. This isn't arbitrary; it reflects SMCI's unique position as the agile, high-density leader for the 'neo-cloud' era (think IREN), tailored to inference's bursty (yes, bursty: demand that can spike from 1x to 100x in a nanosecond) and power-intensive workloads.
Here's why SMCI gets the call-out, grounded in Patel's reports, posts, and industry context:

1. SMCI's Edge: Speed, Customisation, and Hyperscaler Fit for Inference

Rapid Prototyping and Deployment: SMCI excels at "just-in-time" manufacturing, delivering 100,000+ servers in weeks rather than months. This is critical for inference, where hyperscalers like CoreWeave (an SMCI client) demand swift iterations on hybrid NVIDIA/AMD setups to manage variable query loads.
The September 2025 SemiAnalysis report on rack architecture (co-authored by Patel) praises SMCI's modular designs for disaggregated PDUs and liquid cooling, enabling 250kW+ racks with a 30% better total cost of ownership (TCO) than rigid ODM builds. Foxconn and Wiwynn, as pure ODMs, prioritise volume for branded OEMs (e.g., Dell's enterprise kits) but lag in bespoke hyperscaler customisation.
Direct Hyperscaler Relationships: SMCI sells directly to neo-clouds (CoreWeave, xAI's Colossus, which is partly SMCI-supplied) and AI labs (OpenAI's AMD pivot), bypassing intermediaries. In his August 2025 "No Priors" podcast, Patel highlights SMCI's vertical integration (from motherboard design to cooling), giving them a two-year lead on liquid-cooled hybrids, essential for inference's 80%+ energy draw. Dell shines in enterprise/sovereign AI (per Patel's May 2025 "How Dell Is Beating Supermicro" report), but their slower cycles, optimised for HPC stability, don't match neo-cloud urgency.
Inference-Specific Advantages: Backed by vLLM/SGLang (inference engines, sorry, tech heavy!), the framework likely benchmarks rack-level metrics like tokens/second per watt. SMCI's 8U/10U GPU trays (e.g., SYS-821GE with 8x Blackwell) blend NVIDIA prefill compute with AMD decode efficiency, reducing latency by 20-50%. Foxconn/Wiwynn handle high-volume NVIDIA HGX for cloud giants (e.g., Foxconn's Oracle Stargate supply), but Patel's critiques (e.g., 2023 posts on Foxconn's public "reveals" being overhyped) underscore their ODM commoditisation: cheaper, but less innovative for multi-vendor inference.
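For anyone wondering what "tokens/second per watt" and a rack-level TCO comparison actually look like as numbers, here is a minimal back-of-envelope sketch in Python. Every figure in it (rack power, throughput, capex, opex, utilisation) is a made-up placeholder for illustration only, not SMCI, Dell or SemiAnalysis data, and the helper names are my own.

# Back-of-envelope sketch of the rack-level metrics discussed above.
# All numbers are hypothetical placeholders, not vendor or SemiAnalysis figures.

def tokens_per_watt(tokens_per_second, rack_power_watts):
    # Rack efficiency: output tokens per second for each watt drawn.
    return tokens_per_second / rack_power_watts

def tco_per_million_tokens(capex, annual_opex, years, tokens_per_second, utilisation):
    # Lifetime cost divided by lifetime token output, in $ per 1M tokens.
    lifetime_cost = capex + annual_opex * years
    seconds = 3600 * 24 * 365 * years
    lifetime_tokens = tokens_per_second * utilisation * seconds
    return lifetime_cost / (lifetime_tokens / 1e6)

# Hypothetical dense liquid-cooled 250 kW rack vs a conventional ODM build.
dense = tco_per_million_tokens(capex=3_000_000, annual_opex=400_000, years=4,
                               tokens_per_second=500_000, utilisation=0.6)
odm   = tco_per_million_tokens(capex=2_500_000, annual_opex=450_000, years=4,
                               tokens_per_second=300_000, utilisation=0.6)

print(f"dense rack: ${dense:.3f} per 1M tokens")   # ~ $0.12
print(f"ODM rack:   ${odm:.3f} per 1M tokens")     # ~ $0.19
print(f"efficiency: {tokens_per_watt(500_000, 250_000):.1f} tokens/s per watt")

With these invented inputs the denser rack works out roughly a third cheaper per token, which is the shape of the "30% better TCO" argument above; the point is simply that cost per token and tokens per watt, not raw rack price, are the levers this kind of framework benchmarks.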
2. Historical Context from Patel: SMCI as the "Crusher" in AI Racks

Patel's analyses consistently position SMCI as a leader in frontier AI infrastructure. A 12 September 2025 X post details his Supermicro factory tour with CEO Charles Liang, showcasing GB300/B300 and MI355X racks, which ties directly into the tease's NVIDIA/AMD backers. In contrast, his May 2024 report praises Dell for enterprise wins (e.g., Tesla, CoreWeave orders) but notes SMCI's resurgence in neo-clouds via cheaper, denser cooling (e.g., April 2023 post: "I'm such an idiot for not going turbo long SuperMicro... they crush Dell and crew while being much cheaper").
Final thought: SMCI's Unique Position

Dylan's call-out of SMCI reflects their role as the inference infrastructure "backbone" for collaborative, multi-vendor stacks, validated by factory tours, reports, and backers like HPE (also listed). Dell, Foxconn, and Wiwynn play critical roles in enterprise or ODM volume but lack SMCI's hyperscaler agility and hybrid rack innovation for inference. If the full drop (expected around the evening of 9 October BST) includes rack BOMs or benchmarks, SMCI's prominence will likely grow.
