Nvidia Reports Earnings Next Week: Is It a Buy, a Sell, or a Hold?


We remain focused on artificial intelligence (AI) and Nvidia’s data center business. Nothing else matters.

In May, Nvidia provided investors with a shockingly upbeat forecast for the July quarter. It shattered those expectations and guided for a significant increase in the October quarter. We expect Nvidia to beat this guidance, and we’ll see by how much.

We still believe hyperscalers are rushing to buy as many Nvidia graphics processing units (GPUs) as they can in order to train large language models like ChatGPT, both for themselves and for their cloud customers.

All eyes will be on the guidance for the January 2024 quarter.

Demand still appears to be well ahead of supply, so the guidance may indicate how large that gap currently is, along with whether Nvidia remains supply-constrained in building data center GPUs.

Nvidia has argued the latest round of China restrictions will not have a meaningful impact on its business. It’s possible that this is because China frontloaded its GPU orders earlier this year in anticipation of these restrictions. It’s also possible any lost revenue from China will be made up for by tremendous revenue growth in developed markets. Guidance for the January quarter might be indicative here.

With its 3-star rating, we believe Nvidia’s stock is fairly valued relative to our long-term fair value estimate.

Our fair value estimate is $480 per share, which implies an equity value of over $1.1 trillion. Our fair value estimate implies a fiscal 2024 (ending January 2024) price/adjusted earnings multiple of 45 times and a fiscal 2025 forward price/adjusted earnings multiple of 31 times.
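As a back-of-the-envelope check (our arithmetic, using only the $480 fair value estimate and the stated multiples), dividing the fair value by each multiple recovers the adjusted earnings per share those multiples imply:

```python
# Implied adjusted EPS from the fair value estimate and the stated
# price/adjusted-earnings multiples. Figures are from the article;
# the division is our own back-of-the-envelope arithmetic.
fair_value = 480.0      # fair value estimate, $ per share

fy2024_multiple = 45.0  # fiscal 2024 price/adjusted earnings
fy2025_multiple = 31.0  # fiscal 2025 forward price/adjusted earnings

implied_eps_fy2024 = fair_value / fy2024_multiple  # about $10.67
implied_eps_fy2025 = fair_value / fy2025_multiple  # about $15.48

print(f"Implied FY2024 adjusted EPS: ${implied_eps_fy2024:.2f}")
print(f"Implied FY2025 adjusted EPS: ${implied_eps_fy2025:.2f}")
```

In other words, the multiples imply adjusted earnings growing from roughly $10.67 to $15.48 per share between fiscal 2024 and fiscal 2025, a gain of nearly 45%.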

Both our fair value estimate and Nvidia’s stock price will be driven by the firm’s prospects in data centers and AI GPUs, for better or worse. We anticipate a massive expansion in the AI processor market in the decade ahead. We see room for both tremendous revenue growth at Nvidia and competing solutions at either external chipmakers (like Advanced Micro Devices [AMD] or Intel [INTC]) or in-house solutions developed by hyperscalers (such as chips from Alphabet [GOOGL], Amazon [AMZN], or others).

Nvidia’s data center business has already achieved exponential growth, rising from $3 billion in fiscal 2020 to $15 billion in fiscal 2023. The company should see an even sharper inflection in fiscal 2024, as we expect data center revenue to more than double to $41 billion. In gaming (which was formerly Nvidia’s largest business), we model $9.8 billion in revenue in fiscal 2024, nearly $11 billion of revenue in fiscal 2025, and 10% average annual revenue growth thereafter. We have high hopes for Nvidia’s automotive business, as greater processing power will be required in active safety systems and autonomous driving.
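A quick sketch of the growth rates these figures imply (the dollar amounts come from the paragraph above; the compound-growth arithmetic is ours):

```python
# Compound annual growth rate (CAGR) implied by the data center
# segment growing from $3B in fiscal 2020 to $15B in fiscal 2023,
# a span of three fiscal years.
cagr = (15.0 / 3.0) ** (1 / 3) - 1
print(f"Data center CAGR, FY2020-FY2023: {cagr:.1%}")  # 71.0%

# Gaming path: roughly $11B in fiscal 2025, then 10% average
# annual growth thereafter, as modeled above.
gaming = 11.0  # $ billions, fiscal 2025
for year in range(2026, 2029):
    gaming *= 1.10
    print(f"FY{year} gaming revenue: ${gaming:.1f}B")
```

The contrast is the point: data center revenue has compounded at roughly 71% per year, several times the 10% trajectory modeled for gaming.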

We assign Nvidia a Wide Economic Moat, thanks to intangible assets around its GPUs and, increasingly, switching costs around its proprietary software, such as its Cuda platform for AI tools, which enables developers to use Nvidia’s GPUs to build AI models.

Nvidia was an early leader and designer of GPUs, which were originally developed to offload graphics processing tasks on PCs and gaming consoles. The firm has emerged as the clear market share leader in discrete GPUs (over 80% share, per Mercury Research). We attribute Nvidia’s leadership to intangible assets associated with GPU design, as well as the associated software, frameworks, and tools developers need to work with these GPUs.

We don’t foresee any other companies becoming relevant players in the GPU market alongside Nvidia and AMD. Even Intel, the chip industry behemoth, has struggled for many years to build a high-end GPU that gaming enthusiasts would adopt. Its next discrete GPU effort is slated to launch in 2025.

In our view, GPU parallel processing capability is at the heart of Nvidia’s dominance in its various end markets. PC graphics were the initial key application, allowing for more robust and immersive gaming. Cryptocurrency mining also involves many mathematical calculations that can run in parallel, so GPUs have an edge here as well.

In the past decade, GPUs were found to more efficiently run the matrix multiplication algorithms needed to power AI models. Nvidia made shrewd moves to build and expand the Cuda software platform, creating and hosting a variety of libraries, compilers, frameworks, and development tools that allowed AI professionals to build their models. Cuda is proprietary to Nvidia and only runs on its GPUs, and we believe this hardware plus software integration has created high customer switching costs in AI, contributing to Nvidia’s wide moat.
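As a loose illustration of why these workloads map so well to GPUs (plain NumPy here rather than Cuda; the point is only that the core computation is a large batch of independent multiply-adds):

```python
import numpy as np

# A single dense layer in a neural network is essentially one matrix
# multiplication: each output element is an independent dot product,
# which is why thousands of GPU cores can compute them in parallel.
rng = np.random.default_rng(0)
activations = rng.standard_normal((1024, 768))  # batch x features
weights = rng.standard_normal((768, 512))       # features x units

# 1024 x 512 = 524,288 independent dot products in one operation.
outputs = activations @ weights
print(outputs.shape)  # (1024, 512)
```

Training a large model repeats operations like this billions of times, and Cuda’s libraries (matrix kernels, compilers, frameworks) are what let developers run them efficiently on Nvidia hardware, which is the source of the switching costs described above.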


We assign Nvidia a Morningstar Uncertainty Rating of Very High. In our view, the firm’s valuation will be tied to its ability to grow within data centers and AI. Nvidia is an industry leader in GPUs used in AI model training, and it has also captured a sizable portion of demand for chips used in AI inference workloads.

We see a host of tech leaders vying for Nvidia’s leading AI position. We think it is inevitable that leading hyperscale vendors, such as Amazon’s AWS, Microsoft [MSFT], Alphabet, and Meta Platforms [META], will seek to reduce their reliance on Nvidia and diversify their semiconductor and software supplier base, including by developing in-house solutions. Our rating is based on the uncertainty around this market. Nvidia dominates AI today, and the sky is the limit for its profitability if it can maintain this lead over the next decade. However, any sign that alternatives are being developed successfully could meaningfully limit the company’s upside.
