- Nvidia (NVDA) reported fourth-quarter revenue of $68.1B. The stock trades at 18 times 2027 earnings, versus 20-22 times for the S&P 500.
- Nvidia's inference chips benchmark faster than comparable offerings from Google and Amazon. Long-thinking inference models may require 100 times more computation per task.
- 94% of analysts covering Nvidia are bullish, with a consensus price target of $263.39 against a current price of $182.05.
Jim Cramer recently made a pointed on-air appeal about one stock in particular: NVIDIA (NASDAQ: NVDA).
“Look, there’s a lot of people just jumping on board…this is their opportunity to buy Nvidia. It’s not going to get cheaper. It’s hard to imagine it getting cheaper,” Cramer told new investors who were watching the stock from the sidelines.
So what is the real case? It comes down to two things: valuation and competitive positioning.
Cramer cited Morgan Stanley's view that NVDA trades at about 18 times 2027 earnings, which the firm called a "surprisingly good entry point." For context, the broader market typically trades at a forward P/E of 20 to 22 times. If that framework holds, Nvidia is actually trading at a discount to the forward S&P 500 multiple, even though it is one of the most dominant growth companies in the world.
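The valuation gap can be put in rough numbers. A minimal sketch, using only the multiples quoted above (the variable names are illustrative, not from any source):

```python
# Sketch of the valuation argument: NVDA's forward multiple versus the
# broader market's typical range, using figures quoted in the article.
nvda_pe = 18.0                  # Morgan Stanley's 2027 earnings multiple
market_pe_range = (20.0, 22.0)  # typical S&P 500 forward P/E range

for market_pe in market_pe_range:
    # Discount = how far below the market multiple NVDA trades
    discount_pct = (1 - nvda_pe / market_pe) * 100
    print(f"Discount vs {market_pe:.0f}x market multiple: {discount_pct:.1f}%")
```

At 18 times earnings, the stock sits roughly 10% to 18% below the market's typical forward multiple, which is the "discount" the Morgan Stanley framing implies.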
Fundamentals support this. Nvidia just reported fourth-quarter fiscal 2026 revenue of $68.1 billion, significantly exceeding expectations. Revenue over the previous four quarters was $39.3 billion, $44.1 billion, $46.7 billion and $57.0 billion, respectively. This is not a company that is slowing down.
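The sequential growth rates implied by those five quarters can be worked out directly. A short sketch, assuming the figures above are oldest-first:

```python
# Sequential (quarter-over-quarter) growth implied by the quarterly
# revenue figures quoted in the article, in billions of dollars.
revenue = [39.3, 44.1, 46.7, 57.0, 68.1]  # last five quarters, oldest first

for prev, curr in zip(revenue, revenue[1:]):
    growth_pct = (curr / prev - 1) * 100
    print(f"${prev}B -> ${curr}B: {growth_pct:+.1f}% QoQ")
```

Every quarter in the sequence is higher than the last, with the most recent jump (from $57.0B to $68.1B) near 20% quarter over quarter.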
The analyst community agrees: 94% of analysts covering NVDA are bullish, with a consensus price target of $263.39 versus the current price of $182.05 as of March 2, 2026.
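The gap between the current price and the consensus target translates into a concrete implied upside. A minimal sketch, using only the two prices quoted above:

```python
# Implied upside to the consensus analyst target, using the prices
# quoted in the article (as of March 2, 2026).
current_price = 182.05  # NVDA market price
target_price = 263.39   # consensus price target

upside_pct = (target_price / current_price - 1) * 100
print(f"Implied upside to consensus target: {upside_pct:.1f}%")
```

That works out to roughly 45% upside if the stock were to reach the consensus target.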
Beyond valuation, the conversation touched on something discussed less often: Nvidia's competitive position in the inference market. The discussion mentioned the partnership with Groq and highlighted that Nvidia's inference chips benchmarked significantly faster than comparable offerings from Google and Amazon. Crucially, Nvidia also has cost-competitive inference options, meaning it is not just winning the high-end arms race; it is competing across the full stack.
Chief Financial Officer Colette Kress made it clear on the last earnings call: "Our inference needs are accelerating, driven by test-time expansion and new inference models. Long-thinking inference AI may require 100 times more computation per task than one-time inference."