5 AI stocks that could be the next Nvidia — before Wall Street figures it out


  • NVIDIA (NVDA). Vertiv (VRT): $2.88B in Q4 revenue, orders up 252%. Marvell (MRVL): revenue rose 42% to $8.2B. Astera Labs (ALAB): $270.6M in revenue, up 92%. Micron (MU): revenue of $13.6B, up 57%. Lumentum (LITE): $665.5M in revenue, up 65.5%; NVIDIA invested $2B.

  • As AI data centers exceed 100,000 GPUs, the bottleneck shifts from chip delivery to physical infrastructure including cooling, power distribution, networking, memory, and optical connections.

  • The analyst who named NVIDIA in 2010 just named his top 10 AI stocks. Get the list for free here.

NVIDIA (NASDAQ: NVDA) did not become a trillion-dollar company only because it made the best chips. It became one because it was the only company that could deliver what the AI industry needed, exactly when it was needed. That type of structural dependence is what turns stocks into generational winners. So the question every investor should be asking right now is straightforward: Who could be the next NVIDIA?

The GPU shortage helped define 2023 and 2024, but as AI data centers scale in 2026 and beyond, the constraint is shifting: chips are shipping. What's in short supply now is the physical infrastructure needed to make those chips useful: cooling systems, high-speed networking, memory, and optical connectivity, to name a few. Hyperscalers will spend hundreds of billions of dollars in capex this year, and a growing portion of that money is going to companies that most investors ignore.

The five stocks below each sit at a chokepoint in the AI supply chain that literally didn't exist at this scale three years ago. These aren't speculative plays: they have accelerating revenues, record backlogs, and direct ties to NVIDIA, yet the market hasn't fully priced in what those positions are worth.

READ: The analyst who named NVIDIA in 2010 just named his top 10 AI stocks

NVIDIA GPUs are useless without the infrastructure to power them, cool them, feed them data, and move signals between them at the speed AI demands. As clusters exceed 100,000 GPUs, each of those layers becomes a potential bottleneck, and the companies that can solve them become increasingly essential. That is the setup.

NVIDIA ships the chips, but Vertiv (NYSE: VRT) keeps them alive: the company manufactures the power distribution and thermal management systems an AI data center needs before a single GPU comes online. As racks of GPUs exceed 100 kW, liquid cooling has moved from optional to mandatory.

The company’s Q4 2025 revenue reached $2.88 billion with a 252% increase in organic orders, full-year revenue came in at $10.23 billion, and its backlog hit $15 billion, up 109% year-over-year, with a Q4 book-to-bill ratio of 2x.

The NVIDIA parallel is direct: just as GPU supply could not keep pace with demand in 2023 and 2024, cooling and power infrastructure is now the binding limit on how fast data centers can add AI capacity. Vertiv is guiding for 2026 revenue of $13.25 billion to $13.75 billion, which would beat Wall Street's prior estimate of $12.4 billion.

NVIDIA owns the compute, but Marvell Technology (NASDAQ: MRVL) owns the connectivity. To scale AI clusters to hundreds of thousands of GPUs, the speed at which data moves between chips matters as much as the chips themselves. Marvell's custom AI ASICs, optical DSPs, and 1.6T interconnect solutions are the network fabric that holds these clusters together.

As far as the numbers go, fiscal 2026 revenue hit a record $8.2 billion, up 42% year-over-year, with data center products accounting for 74% of total sales and non-GAAP EPS growing 81%.

The bull case for Marvell centers on its custom silicon business, which has grown from nearly zero to $1.5 billion in annual revenue in a year. The company has 18 design wins with hyperscalers, including Microsoft and Amazon, and recently acquired Celestial AI for $3.25 billion to bring photonic interconnect technology in-house. If every major cloud company needs Marvell's chips to connect its AI infrastructure, Marvell is no longer just a supplier but a platform.

Every rack of NVIDIA GPUs requires Astera Labs (NASDAQ: ALAB) retimers to function: the company's PCIe/CXL smart retimers and cable modules keep data flowing between components without signal degradation. That functionality makes Astera the connective tissue inside the GPU rack.

Given its critical position, it's no surprise that Astera's Q4 revenue reached $270.6 million, up 92% year-over-year, with a gross margin of 75.7%, up from 74%, helped by a $6.5 billion Amazon warrant deal and a high-margin product mix.

The stock itself is down nearly 60% from its 2025 peak, as the Amazon deal introduces an estimated 200-basis-point quarterly margin drag starting in Q2, but the structural position holds up well despite near-term profitability concerns. Astera is also expanding shipments to additional hyperscalers, and its Scorpio fabric switches target a connectivity market expected to reach $25 billion over the next five years.

With a market cap of $20 billion and revenue nearly doubling annually, Astera has the profile of a core AI infrastructure name.

AI servers require about three times the memory of standard servers, and the specialized high-bandwidth memory (HBM) that powers NVIDIA GPUs is in a structural shortage that Micron (NASDAQ: MU) is uniquely positioned to exploit. Micron is the only American HBM maker, and it reported fiscal 2026 Q1 revenue of $13.6 billion, up 57% year-over-year, with non-GAAP EPS of $4.78, beating estimates by 20%. The kicker: the company's entire 2026 HBM supply has already been sold out, including its next-generation HBM4.

Valuation is where the NVIDIA comparison becomes compelling. The company is guiding for Q2 revenue of $18.7 billion with EPS of $8.42, which represents EPS growth of roughly 440%, yet the stock trades at nearly 9x forward earnings.

Comparable AI infrastructure names trade at 25 to 30 times earnings, all while the HBM market is projected to grow from $35 billion in 2025 to $100 billion by 2028, and Micron can currently meet only half to two-thirds of its key customers' demand. NVIDIA needs Micron's HBM3E for its Blackwell platform and HBM4 for whatever comes next.

Without Lumentum's (NASDAQ: LITE) lasers, NVIDIA's scale-out AI architectures wouldn't work: the company makes the optical components and co-packaged optics that NVIDIA platforms need to transmit data across massive clusters.

On March 2, 2026, NVIDIA confirmed this dependence by investing $2 billion directly in Lumentum, alongside a multi-billion-dollar purchase commitment for laser components. This is not a partnership announcement; it is NVIDIA locking down a supply chain it cannot build without.

Looking at the numbers, Lumentum's fiscal Q2 2026 revenue reached $665.5 million, up 65.5% year-over-year, and the company swung from a net loss of $78.2 million to net income of $1.1 million. Q3 guidance calls for revenue of $780 million to $830 million, the backlog stands at more than $400 million, and analysts project fiscal 2026 revenue of around $2.6 billion. The stock is up more than 900% in the past year, so valuation is an obvious risk, but NVIDIA's multi-year purchase commitment tells you how strong Lumentum's position is.

Wall Street is pouring billions into AI, but many investors are buying the wrong stocks. The analyst who first identified NVIDIA as a buy in 2010, before its 28,000% run, has identified just 10 new AI companies that he believes can deliver outsized returns. One dominates a $100 billion equipment market. Another addresses the single biggest obstacle to scaling AI data centers. A third is a pure play on the optical networking market, which is quadrupling. Most investors haven't heard of half of these names. Get the free list of all 10 stocks here.
