Nvidia (NVDA) has been witnessing such strong demand for its graphics processing units (GPUs) to support artificial intelligence (AI) applications that its revenue and earnings growth have taken off.
In its most recently reported quarter, the semiconductor giant's revenue doubled year over year to $13.5 billion. Earnings grew by 429% to $2.70 per share. Nvidia is expected to deliver stronger growth in the current quarter as its guidance for revenue of $16 billion points to a 170% year-over-year increase. Its adjusted earnings are anticipated to increase by a whopping 475% from the prior-year period to $3.34 per share.
However, investors who have missed out on Nvidia's eye-popping stock surge of 205% so far in 2023 may be a bit apprehensive about buying it now. That's because the company is trading at an extremely rich 34 times sales and has a price-to-earnings ratio of 108. Nvidia's forward earnings multiple of 55 anticipates healthy growth in earnings, but with concerns rising on Wall Street about the sustainability of its red-hot rally, some investors might prefer to buy a relatively cheaper AI stock to take advantage of this fast-growing technology.
Advanced Micro Devices (AMD) is one option that AI-focused investors should be taking a closer look at. AMD stock is not only substantially cheaper than Nvidia's, but it could also surge big time thanks to the key role the company could play in the proliferation of AI. Shares of AMD are up almost 60% in 2023, and they jumped by almost 5% on Sept. 28 after the company's AI chip development efforts got a vote of confidence from Microsoft (MSFT).
It won't be surprising to see AMD stock go parabolic. A parabolic move refers to a rapid surge in a stock's price over a short period, resembling the steep upward climb of one arm of a parabola. Let's see why that may be the case for AMD.
AMD's AI-related revenue may be about to take off
Microsoft Chief Technology Officer Kevin Scott recently pointed out at a conference in California that AMD is “making increasingly compelling GPU offerings that I think are going to become more and more important to the marketplace in the coming years,” as reported by CNBC.
Nvidia is the dominant player in the market for AI chips with an estimated market share of more than 70%. Other estimates put Nvidia's share of the AI chip market between 80% and 95%. This means Nvidia is currently in pole position to capitalize on the AI chip market, which is expected to generate a whopping $304 billion in annual revenue by 2030, compared to just $29 billion last year.
But Scott's statement indicates that cloud service providers — which are in a race to deploy AI chips to train large language models — are gaining confidence in AMD's products.
AMD CEO Lisa Su's comments during the company's August earnings call further suggest that its AI chips are gaining traction. More specifically, Su said that AMD witnessed a 7x jump in customer engagements for its AI programs, which could lead to “future deployments of Instinct MI250 and MI300 hardware and software at scale.”
It is worth noting that AMD's MI250 accelerator is 80% as fast as Nvidia's A100 processor, which was deployed for training ChatGPT. Meanwhile, the upcoming MI300 accelerators, which are on track to go into production in the fourth quarter, are reportedly going to pack more powerful specs than Nvidia's offerings, suggesting that AMD is set to go aggressively after the lucrative AI chip market.
More importantly, Microsoft isn't the only one that's giving AMD's AI chips a vote of confidence. AI start-up Lamini, which provides a large language model platform to enterprises for building and deploying custom models based on their own data and infrastructure, chose AMD's graphics cards to power its hardware.
Lamini's platform is reportedly being powered by more than 100 AMD GPUs. The start-up says that it can scale up its platform to thousands of AMD GPUs. The good part is that AMD is taking steps to shore up the manufacturing of its AI chips with the help of its foundry partner, Taiwan Semiconductor Manufacturing, popularly known as TSMC.
Taiwan-based trade publication DigiTimes reports that AMD could corner one-third of TSMC's output of AI-focused wafers in 2024, with the other two-thirds going to Nvidia. In other words, AMD would be securing half as much AI chip supply as Nvidia, meaning its AI chip sales could be roughly half of Nvidia's in 2024, which would drive a significant acceleration in AMD's top line.
Nvidia is expected to sell AI chips worth $25 billion to $30 billion in 2023, according to Japanese investment bank Mizuho. If AMD could clock even half of that in 2024 (roughly $12.5 billion to $15 billion), it would be a substantial boost to the $23 billion in revenue the company is expected to generate this year.
AMD is an attractively valued AI stock now
AMD is currently trading at 7.6 times sales — significantly cheaper than Nvidia's price-to-sales ratio. Its forward earnings multiple of 39 is also well below Nvidia's multiple. Analysts are expecting a 20% surge in AMD's revenue and a 50% bump in earnings per share next year — results it may be able to exceed if its AI-focused efforts pay off.
Based on the estimates of 40 analysts, AMD's median price target of $145 for the next year points toward a 42% upside. But the Street-high price target of $200 suggests the stock could nearly double from current levels. So there is a chance this AI stock could go parabolic and deliver eye-popping gains, which is why investors would do well to buy it before it heads higher and becomes more expensive.
Originally published on Fool.com