Monday 24th November 2025
The real winners of the AI revolution are hiding in plain sight
Don't miss the most compelling story in the AI world. A hint: it might not be the one dominating the headlines or driving the stratospheric share price rises.
Three years after the world’s first public encounter with ChatGPT, generative artificial intelligence has moved from curiosity to catalyst, reshaping industries, energising capital markets, and rewriting the playbook for corporate innovation. Yet amid the frenzy of new models and viral applications, investors may be missing the most compelling story: the quiet, powerful ascent of the companies that make AI possible.
These are not the headline-grabbing names building chatbots or digital assistants. They are the builders of the digital backbone — the infrastructure software firms and semiconductor developers providing the raw compute, data and connectivity needed to train and run large language models. In today’s market, we believe they represent compelling investment opportunities for long-term investors.
A once-in-a-decade capex cycle
The scale of investment underpinning this transformation is extraordinary. In 2025, the world’s four major cloud hyperscalers — Microsoft, Amazon, Alphabet and Meta — are expected to pour US$378 billion into capital expenditure, a 65% increase over 2024. This spending surge is not debt-fuelled exuberance; it’s financed largely from the companies’ own cash flows. That distinction matters. It signals that these firms view Gen AI not as a passing trend, but as a secular evolution in computing power and capability.
This wave of investment extends well beyond the “big four.” Oracle, CoreWeave, and other emerging platforms are ramping up their AI infrastructure, chasing the enormous demand for compute power. As a result, we are witnessing what we consider a once-in-a-decade capex supercycle — one that could sustain growth for the ecosystem of technology infrastructure providers for years to come.
Infrastructure software: The silent growth engine
Cloud infrastructure software sits at the heart of this transformation. The companies providing the core computing, storage, and networking capabilities that power AI models are seeing renewed growth. Many enterprises learned the hard way that building in-house AI systems is expensive and unreliable. They are now turning to specialised, battle-tested platforms that can manage the scale and complexity of AI workloads securely and efficiently.
Oracle’s evolution is emblematic. Once considered a legacy software player, it has redefined itself as a credible fourth hyperscaler. Its cloud infrastructure now supports projects from OpenAI, xAI, and Meta, and powers the US Stargate AI initiative. Its partnership with Nvidia and access to vast GPU inventories have made it an indispensable player in training large-scale models.
Data infrastructure firms such as Snowflake and Databricks are equally critical. Quality data — accessible, secure and well-structured — is the lifeblood of AI. These companies help enterprises unify and harness their data for large-scale analytics and model training, enabling the insights and automation that underpin AI-driven growth.
Meanwhile, demand for monitoring and observability software — tools that help companies oversee and secure increasingly complex systems — continues to expand. As AI workloads grow, visibility and reliability become non-negotiable. Firms like Datadog and Dynatrace are positioned to benefit from this structural shift, which we believe remains underappreciated by markets.
Custom silicon: powering the next generation of AI
The explosion of AI has also redrawn the semiconductor landscape. Traditional processors are no longer enough to handle the scale of training and inference workloads. The result? A renaissance in custom silicon, with chips designed specifically for AI.
While Nvidia continues to dominate with its GPU ecosystem, companies like Broadcom and Marvell Technology are seizing share by designing application-specific integrated circuits (ASICs) tailored to hyperscalers’ needs. Broadcom’s partnerships with Google, Meta, and OpenAI exemplify a collaborative model that allows hyperscalers to build differentiated systems without reinventing the wheel.
Networking, too, has emerged as a new frontier. As AI models become more distributed, the ability to move data quickly across thousands of processors is becoming just as important as raw compute power. Marvell’s high-speed connectivity solutions, from Ethernet adapters to optical interconnects, address this critical bottleneck and position the company as a vital enabler of the next generation of AI supercomputing.
Despite its ubiquity in headlines, generative AI is still in its infancy as an investment theme. The eventual winners and losers across the tech ecosystem are yet to be decided. But one dynamic is already clear: sustained capital expenditure from hyperscalers is creating powerful tailwinds for the companies supplying the tools, chips, and software that make AI run.
These infrastructure enablers may lack the immediate glamour of the consumer-facing AI brands, but they possess durable growth, strong cash flow visibility, and a front-row seat to the most important computing revolution in decades.
As investors, our role is to look past the noise and focus on where sustainable value creation will occur. In the Gen AI era, that means looking not just at who’s building the models, but at who’s building the machines that make them possible.
Hilary Frisch, CFA, is a senior analyst, software services & enterprise technology, at ClearBridge Investments