Nvidia News
-
Colette Kress absolutely smashed it at the UBS conference yesterday, proper bullish vibes everywhere! She told the room to forget any “AI bubble” nonsense – this is a multi-trillion-dollar revolution that’s only just kicking off. She’s calling for $3–4 trillion of data-centre spend by the end of the decade, with half of it flowing straight into NVIDIA’s GPUs. Insane numbers.
Demand is 'overwhelming' (the only way to describe it): $500 billion already booked for Blackwell and Rubin chips through 2026, and every single chip they finish ships instantly. Inventory is exploding because they literally cannot build them fast enough. Then layer on the monster OpenAI deal (up to $100 billion still in play) and Anthropic’s billions waiting in the wings – the growth runway is endless.
Competition? Forget about it. Blackwell is flying out the door, Rubin’s already taped out and lands in 2026 with another massive leap, and NVIDIA’s full-stack systems are light-years ahead. Margins staying fat in the mid-70s, inference now a proper money-spinner, and the whole AI flywheel is accelerating like mad.
Honestly, she made NVIDIA look unstoppable. And I would add, she chooses her words carefully: always precise and not one to get carried away, so her comments are very pleasing.
-
Developing news: US Govt in talks to approve mid-tier Nvidia silicon for sale into China. Stock is moving up. Watch this space.
-
It’s up again after hours to 190. Save the kittens

-
Trump green-lights the H200 for select, approved customers in China, with the US Govt taking a 25% share of any sales. It's a net positive given zero previous sales. Will the CCP allow it? It will be hard to resist given the H200 is by far the most powerful chip available in China. My take: anything is better than nothing. I would expect 500k chips, give or take, equating to roughly $30B over the next 12 months.
-
The assertion that China will refuse to purchase NVIDIA’s H200 chips appears to be FUD imo.
Economic imperatives and technological realities strongly suggest otherwise. Recent policy shifts under the Trump administration have relaxed restrictions, permitting H200 exports to pre-approved Chinese customers subject to a 25% surcharge remitted to the US government (Reuters). For hyperscalers such as Tencent, ByteDance, and Alibaba, which have been constrained by the limited performance of the sanctioned H20 or by reliance on smuggled inventory, the H200's 989 BF16 TFLOPS represent a roughly sevenfold leap over the H20's 148 TFLOPS. Domestic alternatives, principally Huawei's Ascend 910C, remain markedly inferior: independent benchmarks place the 910C at approximately 76% of H200 performance per chip (roughly 12,032 TPP vs the H200's 15,840 TPP), with significantly lower efficiency at cluster level and power consumption up to 2.3 times higher for equivalent throughput.
Production is further hampered by SMIC’s 7 nm yields, reportedly below 30%, creating chronic supply shortages. Beijing’s reported requirements—mandatory justification for foreign purchases and restrictions on public-sector use—are largely procedural and political signalling rather than outright prohibition.
President Xi's recent "positive" remarks on United States-China technology co-operation reinforce pragmatic acceptance. In the absence of a credible domestic substitute capable of training frontier-scale models, Chinese enterprises are expected to acquire H200s in volume, potentially generating tens of billions of US dollars in additional revenue for NVIDIA in 2026–2027.
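For anyone who wants to sanity-check those ratios, here is a quick back-of-envelope sketch in Python. The figures are the ones quoted in this post (benchmark methodology varies), not official vendor specs.

```python
# Quick sanity check of the per-chip figures quoted above.
# All numbers are the ones cited in this post (benchmarks vary), not official specs.

h200_bf16_tflops = 989       # H200 dense BF16 throughput, as quoted
h20_bf16_tflops = 148        # sanctioned H20 figure, as quoted
h200_tpp = 15_840            # "Total Processing Performance" metric, as quoted
ascend_910c_tpp = 12_032     # Huawei Ascend 910C, as quoted
cluster_power_factor = 2.3   # quoted power multiple for equivalent 910C throughput

# H200 vs H20: the "roughly sevenfold" leap
print(f"H200 vs H20 (BF16): {h200_bf16_tflops / h20_bf16_tflops:.1f}x")          # ~6.7x

# 910C vs H200 per chip: the "~76%" figure
print(f"910C vs H200 (TPP): {ascend_910c_tpp / h200_tpp:.0%}")                   # ~76%

# Implied cluster-level efficiency if the 910C needs 2.3x the power for the same work
print(f"910C perf/W vs H200 at cluster level: {1 / cluster_power_factor:.0%}")   # ~43%
```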
-
Reports say the first H200 chips from inventory will ship to China in February, with new capacity soon entering production. The February figure is circa $3 billion. Every little helps.

-
Reported by Reuters today
The sudden reopening of the Chinese market for Nvidia's H200 AI chips represents an explosive catalyst for the company's revenue trajectory in 2026 and beyond.
Chinese tech giants have already placed orders for more than 2 million H200 units targeted for 2026 delivery, far outstripping Nvidia's current 700,000-unit inventory. Priced at approximately $27,000 per chip (with variations by volume and including both standalone H200 GPUs and GH200 superchips), fulfilling this backlog could deliver over $54 billion in gross revenue—before the U.S. government's 25% export fee and final adjustments.
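A rough back-of-envelope on those numbers; the $27k ASP and the 25% fee are the figures quoted above, and the fee is shown separately because later posts in this thread describe it as a premium paid by the purchaser rather than borne by Nvidia.

```python
# Back-of-envelope on the reported China H200 backlog, using the figures above.
# Assumptions: flat $27k ASP across the mix; the 25% export fee is shown separately.

asp = 27_000            # approximate price per chip quoted above
orders = 2_000_000      # reported orders targeted for 2026 delivery
inventory = 700_000     # units reportedly on hand
export_fee = 0.25       # US government surcharge

print(f"Full backlog:              ${orders * asp / 1e9:.0f}B gross")           # ~$54B
print(f"Coverable from inventory:  ${inventory * asp / 1e9:.1f}B")              # ~$18.9B
print(f"Needs new production:      ${(orders - inventory) * asp / 1e9:.1f}B")   # ~$35.1B
print(f"25% surcharge on backlog:  ${orders * asp * export_fee / 1e9:.1f}B")    # ~$13.5B
```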
Bring it on!
-
Is this old news from Nvidia?
Vera Rubin is in full production. We just kicked off the next generation of AI infrastructure with the NVIDIA Rubin platform, bringing together six new chips to deliver one AI supercomputer built for AI at scale.
-
Happy NY, C!
If you recall, we discussed Micron's HBM4 entering production in December (early), which led us to speculate that Rubin would be early. So yes, it's confirmation that Nvidia are ahead of schedule: excellent news. The flow-on effects will be significant. Given the packaging utilises CoWoS-L (the same process as Blackwell) and the die is on the mature 3 nm process, the transition should be seamless. AMD are going all-in on 2 nm, but from what I've heard (as expected) yields are still poor and this will likely cause them problems in 2026. I don't think they had any choice, as they can't compete with their current systems (not even against Hopper).
The wind is very much at Nvidia's back (as expected).
-
NVIDIA's CES 2026 announcements from Jensen Huang's keynote.
NVIDIA unveiled the Vera Rubin platform (successor to Blackwell) as its first extreme co-designed, six-chip AI supercomputer, now in full production. This includes the Vera CPU, Rubin GPU, NVLink 6 switch, ConnectX-9 SuperNIC, BlueField-4 DPU, and Spectrum-6 Ethernet switch—all redesigned simultaneously for rack-scale efficiency. Those in the know are calling this 'insane' (the complexity of doing it all at once).
Key confirmed details: The Rubin GPU delivers up to 5x greater inference performance (and 10x lower cost per token for some workloads) than Blackwell, with architectural gains (e.g., HBM4 memory providing ~2.8x bandwidth) far exceeding traditional Moore's Law increments (25% per generation).
Transistor count is around 1.6x higher (e.g., ~336 billion vs Blackwell's ~208 billion in comparable configs), enabling the outsized leap through co-design.
The Vera Rubin NVL72 rack offers massive scale-up bandwidth (~260 TB/s, i.e. 260 terabytes per second, via 6th-gen NVLink), often described as exceeding global internet traffic for context. Expected price per rack is circa $8M-$10M. Expected volume as they ramp: around 7k racks in 2026 and up to 30k in 2027 (see the rough revenue sketch at the end of this list).
Advanced liquid cooling supports hot water (up to 45°C), boosting efficiency and reducing data centre power consumption.
Alpamayo is the world's first open-source family of reasoning models (including chain-of-thought vision-language-action) for autonomous vehicles, with full code, datasets, and tools (e.g., AlpaSim) released on Hugging Face/GitHub.
The all-new Mercedes-Benz CLA will be the first production vehicle with NVIDIA's full AV stack (including Alpamayo for explainable reasoning, e.g., narrating decisions like braking for a truck/cyclist). Launch: US in Q1 2026, Europe Q2, Asia later in the year. Huang demoed it hands-free in San Francisco traffic.
Strategy: Open models accelerate ecosystem adoption (like DeepSeek's impact), driving demand for NVIDIA hardware ("give away the recipes, sell the kitchen"). Partnerships include Siemens (industrial tools), Cadence/Synopsys (chip design), Palantir/ServiceNow/Snowflake (platforms), and AV players like Lucid/JLR/Uber.
NVIDIA positions itself ~18 months ahead, with competitors chasing Blackwell while Rubin ramps.
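For context on the NVL72 figures above, here is a rough sketch of what the quoted rack price and ramp could imply for revenue; the $8M-$10M ASP and the 7k/30k volumes are this post's estimates, not company guidance.

```python
# Illustrative only: what the quoted NVL72 rack pricing and ramp could imply for revenue.
# The $8M-$10M ASP and the 7k/30k rack volumes are this post's estimates, not guidance.

rack_price_low, rack_price_high = 8e6, 10e6   # circa $8M-$10M per Vera Rubin NVL72 rack
ramp = {2026: 7_000, 2027: 30_000}            # expected rack volumes as quoted

for year, racks in ramp.items():
    low = racks * rack_price_low / 1e9
    high = racks * rack_price_high / 1e9
    print(f"{year}: {racks:,} racks -> ${low:.0f}B-${high:.0f}B")
# 2026: 7,000 racks -> $56B-$70B
# 2027: 30,000 racks -> $240B-$300B
```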
-
Nvidia's CEO said (7 Jan) that demand for Nvidia products is high across the board and "I'm fully expecting a really giant year for our business with TSMC," media report. TSMC makes most of Nvidia's chips.
Nvidia CEO Jensen Huang said (7 Jan) that H200 AI chip demand from China clients is "very high" and that "we've fired up our supply chain, and H200s are flowing through the line," adding that the signal of Beijing's approval will be purchase orders. "If the purchase orders come, it's because they're able to place purchase orders," he said.
None of this is factored into the stock. Let's see how this pans out, but the takeaway, as always, is to listen to what the man himself says, not the FUD; he has a great track record of being the only credible source of the facts.
A reminder: unlike most companies, Nvidia's quarter end is 31 January (not 31 December).
-
News just reported by Bloomberg:
China approves H200 imports
China is moving towards reopening of its market to Nvidia’s H200 AI chips, allowing imports for selected commercial uses while restricting access for the military, sensitive government bodies, critical infrastructure, and state-owned enterprises. The policy mirrors China’s broader approach of balancing foreign technology access with national security.
The H200 is an older-generation chip cleared for export under US rules, with shipments approved in December under a 25% surcharge. Demand is strong, with Alibaba and ByteDance each reportedly interested in orders exceeding 200,000 units. NB: the 25% surcharge, to be passed on to the US Govt, is believed to be a premium paid by the purchaser, not borne by Nvidia.
Imo demand is closer to 2M chips. Nvidia have 700k on hand and have asked TSMC to restart production. Monetisation should start in February, which is Q1 of the next fiscal year; the quantum is in the order of several billion dollars, which can only help the guide.
-
Nvidia is rising in pre-market trading, driven by a mix of fresh headlines and ongoing sentiment.
The main focus is China. Reports indicate that Jensen Huang is visiting China, which has reignited speculation around Nvidia’s ability to maintain or expand its presence there despite export restrictions.
China remains a huge end-market for AI and data-centre chips, and even modest signs of regulatory flexibility or creative workarounds tend to lift investor confidence. Nothing concrete has changed yet, but markets are reacting to the possibility of improved access or stabilised demand rather than confirmed policy shifts.
Another contributor is talk around pricing power. There are reports circulating that Nvidia has been able to raise prices on certain GPUs and related components due to overwhelming demand for AI compute. While these claims are not formally confirmed by the company, they reinforce the prevailing narrative that Nvidia is operating from a position of strength, with customers willing to pay up for scarce, high-performance hardware. For investors, this translates into expectations of stronger margins and resilient revenue growth.
Momentum also plays a role. Nvidia has been a major driver of recent gains across US equity indices, and strong prior performance often spills into pre-market trading as traders position ahead of the open. Pre-market volumes are relatively thin, so even modest buying can exaggerate price moves.
In short, the stock is up on sentiment, speculation, and momentum — encouraging signals, but not a fundamental game-changer on their own.
-
It's coming. Rumoured to be an initial $18 billion; should be a nice addition to Q1 (Feb through April '26).

-
Semiconductor news just in:
Orders for AI servers have outstripped supply at Foxconn, the world’s biggest AI server maker, media report, adding it’s working overtime to fulfill Nvidia GB300-based server orders now. Foxconn is boosting automation to raise output. Trial production of Vera Rubin-based servers has begun.
-
China has approved the first official imports of Nvidia’s H200 AI chips, reversing an earlier de-facto blockade at customs. That’s a big shift — Beijing had been blocking physical shipments even after the U.S. gave export permission.
The approval covers hundreds of thousands of chips, with ByteDance, Alibaba and Tencent cleared to buy a combined ~400,000+ H200 units.
Other Chinese companies are reportedly waiting in line for subsequent approvals — but the total number that will get licensed is still unknown.
The move came during a visit by Nvidia CEO Jensen Huang to China, underscoring the political timing.
This isn’t just a supply issue — it’s a policy shift in how China deals with foreign high-end chips. It's speculated that Nvidia have 700k H200s in hand, and I would expect these orders to be booked in Q1, so theoretically they should be included in the Q1 guide reported alongside their Q4 earnings next month (the quarter ends 30 January). 400k chips would equate to around $13-$14B, plus memory and ancillary components, plus the 25% tariff.
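A rough sketch of the maths behind that $13-$14B figure; the "loaded" ASP range is my assumption to bridge the gap between the bare ~$27k chip price quoted earlier in the thread and the headline number (memory, ancillary components, configuration mix).

```python
# Rough maths behind the $13-$14B estimate for the ~400k approved H200 units.
# The "loaded" ASP range is an assumption, used only to reconcile the bare ~$27k
# chip price quoted earlier in the thread with the headline figure.

units = 400_000
bare_chip_asp = 27_000                 # standalone chip price quoted earlier
loaded_asp_range = (33_000, 35_000)    # assumed ASP consistent with $13-$14B
surcharge = 0.25                       # 25% reportedly paid to the US Govt by the purchaser

print(f"Chips only: ${units * bare_chip_asp / 1e9:.1f}B")                         # ~$10.8B
for asp in loaded_asp_range:
    print(f"At loaded ASP ${asp:,}: ${units * asp / 1e9:.1f}B")                   # $13.2B / $14.0B
print(f"Surcharge at bare ASP: ${units * bare_chip_asp * surcharge / 1e9:.1f}B")  # ~$2.7B
```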
-
Big numbers. Thanks Adam.
The thing that slightly bothers me with Nvidia is that they seem to be too good to be true. And we all know the phrase that applies when that's the case. They are sitting pretty having created the biggest ticket in the fattest technological revolution since the internet, in the juiciest industrial advance since the industrial revolution, and have a stranglehold on the good stuff so tight the established players can't even see which way they are going. Every man and his dog who thinks they want a slice of the AI pie are queuing around the block to buy their chips and they are selling things now that won't exist for another year, given the demand.
What are the chinks in their armour? What could go wrong? What are the risks? Huang is a seriously clever guy, and his brains are matched with a conservative integrity that is most appealing in business - what happens if he slips and falls under a big red bus? What if China does invade Taiwan? What if someone magics up some new technology (Organic Computing? Spintronics?) which stands the world on its ear and makes everything we currently use redundant? What if aliens invade and .... essentially, what could go wrong?
I hear and understand the argument that says that if China invades Taiwan then we probably will have bigger things to worry about than the value of share portfolios, but the general question stands; where could it go wrong?
-
Some fair points.
My concern is more around how much of the AI growth is actually vendors just moving things around.
The sort of picture drawn here. I don’t think it is just that… but I still feel we have road bumps ahead.