- NVIDIA (NVDA). Vertiv (VRT) reported fourth-quarter revenue of $2.88 billion, with orders growing 252%. Marvell (MRVL) posted $8.2 billion in revenue, up 42%. Astera Labs (ALAB) reported revenue of $270.6 million, up 92%. Micron (MU) posted $13.6 billion in revenue, up 57%. Lumentum (LITE) posted revenue of $665.5 million, up 65.5%, and received a $2 billion investment from NVIDIA.
- As AI data centers scale beyond 100,000 GPUs, the bottleneck shifts from chip supply to physical infrastructure, including cooling, power distribution, networking, memory, and optical interconnects.
- The analyst who called NVIDIA in 2010 has just listed his top ten AI stocks. Get them for free.
NVIDIA (NASDAQ: NVDA) is a multi-trillion-dollar company not just because it makes the best chips, but because it seems to be the only company that can deliver what the AI industry needs, when it needs it. That kind of structural dependence is what makes a stock a generational winner. So the question every investor should be asking right now is simple: who could be the next NVIDIA?
GPU shortages defined 2023 and 2024, but the constraint is changing: chips are shipping, and as AI data centers scale in 2026 and beyond, what’s in short supply is the physical infrastructure needed to make those chips functional — cooling systems, high-speed networking, memory, optical interconnects, and more. Hyperscalers will spend hundreds of billions of dollars in capital expenditures this year, with a growing share of that money going to companies most investors haven’t paid enough attention to yet.
The five stocks below all sit at a bottleneck in an artificial intelligence supply chain that, at this scale, did not exist three years ago. The bottom line: these are not speculative bets. They have growing revenue, record order backlogs, and direct ties to NVIDIA, yet the market has not fully priced in the value of these positions.
NVIDIA GPUs are useless without the infrastructure to power them, cool them, provide data, and transmit signals between them at the speeds required by AI. As clusters scale beyond 100,000 GPUs, each of these capabilities becomes a potential bottleneck, and companies that can help solve this problem will become increasingly indispensable. That’s the setup.
If NVIDIA supplies the chips, Vertiv (NYSE: VRT) keeps them alive, making the power distribution and thermal management systems every AI data center needs before a single GPU comes online. As GPU racks come online and top 100 kilowatts, liquid cooling has gone from optional to mandatory.
The company’s fourth-quarter 2025 revenue reached $2.88 billion, organic orders surged 252%, and full-year revenue hit $10.23 billion. The order backlog stood at $15 billion, up 109% year over year, and the fourth-quarter book-to-bill ratio was 2.9 times.
The parallel with NVIDIA is immediate: just as GPU supply could not keep up with demand in 2023 and 2024, cooling and power infrastructure are now the constraints on deploying AI data center capacity. Vertiv expects revenue of $13.25 billion to $13.75 billion in 2026, ahead of Wall Street’s previous forecast of $12.4 billion.
NVIDIA has the computing power, but Marvell Technology (NASDAQ: MRVL) has the connectivity, and as AI clusters scale to hundreds of thousands of GPUs, the speed at which data moves between chips becomes as important as the chips themselves. Marvell’s custom AI ASICs, optical DSPs, and 1.6T interconnect solutions form the network fabric that ties these clusters together.
Revenue in fiscal 2026 reached a record $8.2 billion, up 42% year over year, with data center products accounting for 74% of total sales, and non-GAAP earnings per share grew 81%.
The clearest parallel to NVIDIA is Marvell’s custom chip business, which grew annual revenue from near zero to $1.5 billion in a single year. The company has won 18 design wins with hyperscalers including Microsoft and Amazon, and recently acquired Celestial AI for $3.25 billion to bring photonic interconnect technology in-house. Every major cloud company will likely need Marvell’s chips to connect its AI infrastructure. It is no longer just a supplier, but a platform.
Each rack of NVIDIA GPUs requires Astera Labs’ (NASDAQ: ALAB) retimers to function, and the company’s PCIe/CXL retimers, smart DSPs, and cable modules keep data flowing between components without signal degradation. That performance makes Astera the connective tissue inside a GPU rack.
Given that critical role, it’s no surprise that Astera’s fourth-quarter revenue reached $270.6 million, up 92% year over year, though first-quarter gross margin guidance slips from 75.7% to 74% due to Amazon’s $6.5 billion warrant agreement and a product mix shift tied to the Taurus expansion.
The stock itself is down nearly 60% from its 2025 peak, as the Amazon partnership is expected to drag quarterly margins down by 200 basis points starting in the second quarter, but Astera’s structural position remains intact despite near-term profitability concerns. The company is also expanding shipments to more hyperscalers, with its Scorpio architecture targeting a connectivity market expected to reach $25 billion over the next five years.
With a market capitalization of $20 billion and revenue nearly doubling every year, Astera is an important name in artificial intelligence infrastructure.
AI servers require roughly three times the memory of standard servers, and Micron Technology (NASDAQ: MU) is uniquely positioned amid a structural shortage of the specialized high-bandwidth memory (HBM) that powers NVIDIA GPUs. Micron is the only U.S. HBM manufacturer. Revenue in the first quarter of fiscal 2026 was $13.6 billion, up 57% year over year, and non-GAAP earnings per share of $4.78 beat expectations by more than 20%. What’s more, the company’s entire 2026 HBM supply is sold out, including the next-generation HBM4.
The valuation looks compelling next to NVIDIA’s: the company expects second-quarter revenue of $18.7 billion and earnings per share of $8.42, a figure that represents 440% earnings growth, yet the stock trades at about 9 times forward earnings.
Similar AI infrastructure names trade at 25 to 30 times. The HBM market is expected to grow from $35 billion in 2025 to $100 billion in 2028, and Micron can currently meet only 50% to two-thirds of demand from its major customers. NVIDIA’s Blackwell platform requires Micron’s HBM3E, and everything that follows will require HBM4.
NVIDIA’s scale-out AI architecture wouldn’t work without Lumentum’s (NASDAQ: LITE) lasers; the company makes the optics and co-packaged optics that move data across the massive clusters NVIDIA’s platforms require.
NVIDIA validated that dependence on March 2, 2026, investing $2 billion directly in Lumentum and committing to purchase billions of dollars of laser components. This isn’t a partnership announcement; it’s NVIDIA locking in a supplier it couldn’t build without.
Lumentum’s revenue in the second quarter of fiscal 2026 reached $665.5 million, up 65.5% year over year, and the company swung from a net loss of $78.2 million to a net profit of $1.1 million. Even better, third-quarter guidance calls for revenue of $780 million to $830 million, the optical backlog exceeds $400 million, and analysts expect fiscal 2026 revenue of around $2.6 billion. The stock has surged more than 900% in the past year, so valuation is an obvious risk, but NVIDIA’s multi-year purchase commitment tells you how strong Lumentum’s position is.
Wall Street is pouring billions into artificial intelligence, but most investors are buying the wrong stocks. The analyst who first identified NVIDIA as a buy in 2010 (before the stock surged 28,000%) has just identified 10 new AI companies that he believes can deliver outsized returns from here. One dominates a $100 billion equipment market. Another solves the biggest bottleneck holding back AI data centers. A third is a pure play on an optical networking market set to quadruple in size. Most investors haven’t heard of half of these names. Get a free list of all 10 stocks here.
