General News
-
From what I've read, it's a big leap forward in ability. All the big-name CEOs have commented on it.
Now, the lab versions OpenAI is running with all this new funding are probably on a whole different level: think of them more like an IQ of 200 vs GPT-5's IQ of 140. They can handle huge, multi-step problems that require combining logic, memory, maths, language, vision, even code.
They could plan projects, predict complex systems, design experiments, or coordinate multiple agents at the same time.
The usefulness comes down to real-world problem solving. Where public GPT-5 is excellent for everyday tasks (writing an essay, helping with code, summarising documents), the lab models might help scientists design new drugs, optimise supply chains globally, simulate economic or climate scenarios, or even run advanced robotics tasks.
In short: public GPT-5 is smart, but the lab versions are the ones likely showing frontier-level reasoning, memory, and creative problem-solving, the kind of AI that could tackle tasks humans find extremely challenging or slow.
I believe the following is a realistic example based on reliable sources I read:
Drug discovery and design.
With public GPT-5, you could ask it to summarise research papers on a disease, suggest plausible molecular targets, or draft a report on clinical trial data. It's helpful, but a human scientist still has to do the heavy lifting: designing molecules, simulating their behaviour, and predicting side effects.
Now imagine the lab model: it could ingest millions of molecular structures, biochemical pathways, patient datasets, and research papers simultaneously, then design entirely new compounds, simulate their interactions, predict toxicity, and optimise for effectiveness, all in a fraction of the time a team of experts would take. It could even propose multiple variations, rank them by likelihood of success, and adapt its suggestions based on real-world lab results.
The difference is like going from a super-intelligent research assistant to an autonomous research team that can plan, iterate, and predict outcomes across disciplines. Public GPT-5 gives you ideas; the lab model starts doing the actual work, making discoveries that would otherwise take years. Anthropic's CEO described the leap as going from working with a good PhD student to working with a country of Nobel Prize winners. He means millions of genius-level AI agents all working on the same task, each operating independently.
From what I've been reading and hearing, the increments in smarts we've seen publicly from GPT-3 to 4 to 5 are nothing next to the lab version, which is more like a leap from 5 to 10!
I think we will see something very impressive in the next 6 months. The funding is to scale out the compute so OpenAI can prepare for the huge influx of enterprise use. And it would appear that Amazon just got the contract to provide the cloud capacity to serve it.
-
A quick update.
Taking in yesterday and today in real time, net growth is down approx. 1.25%.
Regardless, we feel this is a temporary blip that has no bearing on the health of the companies we invest in, nor will it. It's nothing more than the stocks that have risen the fastest ('short-duration stocks') getting pulled back more than others, regardless of their quality (with a few exceptions).
There is plenty of company-specific positive news coming out (Micron, for example) showing clear evidence of businesses getting stronger whilst their stock gets cheaper.
-
Real-time growth +2%. This is no indication it will remain the same in 5 minutes.

-
We have spoken about this for a while. It's one thing having ambitions to compete. In a constrained market, what matters is having the components (the supply chain). Most do not.

-
Nvidia & Texas Instruments collaborate to build physical AI (humanoid robots).
The collaboration combines Texas Instruments' real-time motor control, precision sensing, radar and power-management technology with Nvidia's advanced robotics compute platforms and simulation software to help developers design, test and deploy humanoid robots faster and more safely.
At the hardware level, TI's mmWave radar sensors are being integrated with Nvidia's Jetson Thor platform using Nvidia's Holoscan Sensor Bridge. That matters because radar adds a layer of perception that cameras alone can't provide. Cameras struggle in fog, glare, low light or when detecting transparent or reflective surfaces like glass. Radar works in those conditions and provides precise distance and velocity data in real time.
The broader goal is improved 3D perception, sensor fusion and low-latency decision-making. For humanoid robots, that means better balance, safer human interaction, more reliable obstacle avoidance and tighter motor control. It also addresses a core robotics bottleneck: synchronising high-performance AI inference with deterministic, real-time physical control systems.
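To make the sensor-fusion point concrete, here's a toy Python sketch (my own illustration with made-up numbers, not TI's or Nvidia's actual software): each sensor's distance estimate is weighted by its confidence, so when fog or glare makes the camera unreliable, the fused estimate leans on the radar.

# Toy inverse-variance fusion of a camera depth estimate and a radar range.
# Hypothetical values; real robots use far richer filters (e.g. Kalman filters).
from dataclasses import dataclass

@dataclass
class Measurement:
    distance_m: float  # estimated distance to the obstacle, in metres
    variance: float    # how uncertain this sensor is right now

def fuse(camera: Measurement, radar: Measurement) -> Measurement:
    # Weight each reading by the inverse of its variance, then renormalise.
    w_cam = 1.0 / camera.variance
    w_rad = 1.0 / radar.variance
    distance = (w_cam * camera.distance_m + w_rad * radar.distance_m) / (w_cam + w_rad)
    return Measurement(distance, 1.0 / (w_cam + w_rad))

# Camera struggling in fog (high variance), radar still precise.
print(fuse(Measurement(2.9, 4.0), Measurement(2.1, 0.05)))  # ~2.11 m, close to the radar value
-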
Headline CPI (US) holds at +2.4% Y/Y in February, as expected.
Solid gains today as a result.
-
Breaking: US/Iran in talks to end hostilities.
Futures swung 700 points to +500.
Oil plunges to $90.
-
Feels like TACO Donnie is making up the talks so he can save face over his ludicrous invasion (see many sources!)... but better that than the 'great peace President' succeeding in starting WWIII.

Still, it kept the Trump-Epstein files off the front pages for another week or two.
-
Hi Mike, it's fine to call into question the veracity of the US President's comments, but we try to keep this place apolitical for one simple reason: it's a polarising subject. Thank you.
