AMD Could Fail in Artificial Intelligence for 1 Reason | The Motley Fool


Demand for powerful chips capable of training and running advanced artificial intelligence (AI) models, particularly large language models like OpenAI’s ChatGPT, is exploding. Some estimates put the market for AI chips growing to nearly $400 billion by 2032. For comparison, the market for CPUs totaled about $108 billion last year.

While outlooks for AI chip demand a decade from now may prove overly optimistic, demand will almost certainly rise substantially in the years ahead. AI is a transformative technology that will find its way into nearly every industry.

Nvidia (NVDA 2.40%) has been the prime beneficiary of the booming demand for AI chips. The company isn’t quite the only game in town, but it has a key advantage that will make it a slog for any competitor to catch up. Advanced Micro Devices (AMD 4.21%) is planning to launch an updated slate of high-powered data center GPUs targeting the most computationally intense AI workloads, but even if the hardware is top-notch, the software could hold the company back.

Analysts at Baird sharply reduced their price target for AMD stock on Thursday, with the company's AI potential being the main sticking point. While Baird kept an "outperform" rating on the stock, it cut its price target from $170 all the way to $125.

Baird is worried about two things. First, while AMD has said its customer engagements around its AI chips soared 7-fold in the second quarter, locked-in design wins outside of the supercomputer market haven't yet materialized. That could change as AMD gets closer to launching its new MI300 family of AI chips, but for now an AI tailwind has yet to show up in results.

Second, Baird believes that Nvidia’s mature software ecosystem will be tough to overcome. While Nvidia’s incredible success selling its data center GPUs for AI applications really only kicked off this year, the company has been plugging away for nearly two decades. Nvidia first launched its CUDA compute platform back in 2006, allowing developers to harness its GPUs for a wide variety of computationally intensive tasks.

The ecosystem around CUDA now includes more than 150 libraries, software development kits, and tools. CUDA is at the foundation of thousands of applications built over the years. Nvidia’s platform is pervasive, and it only works with the company’s GPUs.

AMD is looking to shake things up with its own open compute platform, called ROCm. The platform has been around since 2016, but AMD is now pushing it as an alternative to Nvidia's proprietary offerings for AI workloads. The latest version of ROCm does support a variety of popular AI frameworks, including TensorFlow and PyTorch, but having launched a decade after CUDA, it's simply not as mature.

Nvidia's CUDA software ecosystem will slow down any attempt by a competitor to make a splash in the AI chip market. Baird expects AMD to win only a 5% share of the AI chip market after 2024. Not only is AMD dealing with Nvidia, but it also must contend with Intel. Intel is attacking the AI market on multiple fronts, including AI hardware built into its data center CPUs, data center GPUs, and its specialized line of Gaudi AI accelerators. Intel has a huge data center installed base advantage over AMD, which could act as an additional headwind.

Shares of AMD have surged this year, partly because of the excitement around AI. But it’s far from a guarantee that the company will be able to successfully break into the AI chip market in a big enough way to move the needle. Nvidia’s dominance isn’t an accident. The market leader has been iterating and improving its data center GPUs and software ecosystem for a very long time.

While AMD is at a severe disadvantage, demand may be strong enough for the company to succeed regardless. With Nvidia's high-powered data center GPUs in short supply, buyers of AI chips have a strong incentive to foster viable alternatives. If AMD's upcoming data center GPUs can come close to matching Nvidia's leading products in terms of performance, potential customers may be willing to accept the software downsides in order to secure the hardware they need to train AI models.

AMD has the potential to be a major player in the AI chip market, but it’s far from a sure thing.
