After an impressive quarter from Nvidia (NVDA) and significantly raised guidance, we believe shares of the AI chipmaker can gain another 14% from record highs over the next six to nine months. That's generally the time horizon for the Club's price targets, and we're raising ours on Nvidia to $450 a share from $300. We're keeping our rating on the stock at 2, which indicates we're willing to wait for a pullback before buying more. Nvidia closed Wednesday at $305 a share, before stunning financial results pushed shares up nearly 29% to Thursday's high of $394.80. The company is now knocking on the door of the $1 trillion market capitalization club. Jim Cramer, an Nvidia fan since at least 2017, recently named it his second core Club stock (Apple was the first). Jim even renamed his dog Nvidia. Our new $450-a-share price target values Nvidia at roughly 45 times our fiscal 2025 (calendar 2024) earnings estimate. Nvidia has an unusual fiscal calendar; the results it reported Wednesday night were for the first quarter of its fiscal 2024. While 45 times earnings isn't cheap, at just over twice the current valuation of the S&P 500, it is only slightly above the roughly 40-times average multiple investors have put on the stock over the past five years. In our view, it's more than justified when factoring in the growth runway Nvidia has ahead of it. That's what we're seeing Thursday, as this latest round of upward estimate revisions also serves as a reminder that Nvidia has, more often than not, proven to be cheaper (or more valuable) than first thought, because analysts have consistently been too conservative about the potential of this disruptor, which is now fully emerging as the undisputed leader in chips for running generative AI.
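As a back-of-the-envelope check on the figures above: the $450 target and the 45-times multiple together imply an earnings estimate of about $10 per share, and the target sits roughly 14% above Thursday's record high. A minimal sketch using only the article's numbers (the implied EPS is our own inference, not a published estimate):

```python
# Figures from the article; the implied EPS below is a back-of-envelope
# inference from the target and the multiple, not a published estimate.
price_target = 450.0   # new Club price target, $/share
forward_pe = 45.0      # multiple on the fiscal 2025 earnings estimate
record_high = 394.80   # Thursday's intraday record high, $/share

# Price target = multiple x EPS, so the implied EPS is target / multiple.
implied_eps = price_target / forward_pe

# Percentage gain from the record high up to the price target.
upside = (price_target / record_high - 1) * 100

print(f"Implied EPS estimate: ${implied_eps:.2f}")  # $10.00
print(f"Upside to target: {upside:.0f}%")           # 14%
```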
NVDA 5Y mountain Nvidia's performance over the past five years Jim has been praising Nvidia CEO Jensen Huang for years, not to mention covering many of the GPU technologies that positioned the company to benefit when AI exploded into the consumer consciousness with the spread of ChatGPT this year. On the post-earnings call Wednesday night, management made it clear that they see business picking up later this calendar year. Although they don't officially guide beyond the current quarter, the team said that demand tied to generative AI and large language models has extended their data center visibility out several quarters, and that they have secured significantly higher supply for the second half of the year. Management indicated that results in the second half of the year will be greater than in the first half. The demand they're describing is broad-based, coming from consumer internet companies, cloud service providers, enterprise customers and even AI startups. Keep in mind that Nvidia's first CPU is due out later this year, with management noting that "at this week's International Supercomputing Conference in Germany, the University of Bristol announced a new supercomputer based on the Nvidia Grace CPU Superchip, which is six times more energy efficient than the previous supercomputer." Energy efficiency is a major selling point. As we saw in 2022, energy is a significant input cost when running a data center, so anything that can be done to reduce it will be very attractive to customers looking to boost their own profitability. Omniverse Cloud is also on track to be available in the second half of the year. At a higher level, management spoke on the call about the need for data centers globally to go through a significant upgrade cycle in order to handle the computing demands of generative AI applications such as OpenAI's ChatGPT.
(Microsoft, also a Club name, is a major backer of OpenAI and is using the startup's technology to power its new AI-driven Bing search engine.) "The world's data centers are moving towards accelerated computing," Huang said Wednesday night. That's roughly $1 trillion of data center infrastructure in need of a revamp, because it relies almost entirely on CPUs, which, as Huang pointed out, means "it's basically unaccelerated." With generative AI clearly becoming a new norm, and GPU-based accelerated computing being far more energy efficient than unaccelerated CPU-based computing, Huang said data center budgets need to shift significantly toward accelerated computing, adding, "we are seeing that now." As noted in our guide to how the semiconductor industry works, the CPU is essentially the brain of the computer, responsible for retrieving instructions or inputs, decoding those instructions, and dispatching them in order so that an operation can be executed to achieve the desired result. GPUs, on the other hand, are more specialized and excel at handling many tasks at once: whereas a CPU processes data sequentially, a GPU decomposes a complex problem into many small tasks and executes them simultaneously. Huang went on to say that, going forward, the capital expenditure budgets coming from data center customers will be heavily focused on generative AI and accelerated computing infrastructure. So over the next five to 10 years, we stand to see what is now roughly $1 trillion (and growing) in data center budgets shift very dramatically in Nvidia's favor as cloud service providers seek accelerated computing solutions. In the end, it's really quite simple: all roads lead to Nvidia. Any significant company is migrating workloads to the cloud, whether to Amazon Web Services (AWS), Microsoft Azure or Google Cloud, and all of those cloud providers rely on Nvidia to power their offerings. Why Nvidia?
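The sequential-versus-parallel distinction above can be illustrated with a toy Python sketch. This is only a conceptual analogy (real GPUs run thousands of hardware cores in lockstep, not Python threads), and the function names are our own:

```python
from concurrent.futures import ThreadPoolExecutor

def cpu_style_sum(data):
    """CPU analogy: process the data sequentially, one element at a time."""
    total = 0
    for x in data:
        total += x
    return total

def gpu_style_sum(data, n_chunks=4):
    """GPU analogy: decompose the problem into many small tasks,
    execute them simultaneously, then combine the partial results."""
    size = max(1, len(data) // n_chunks)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor() as pool:
        partials = pool.map(sum, chunks)  # each chunk is summed concurrently
    return sum(partials)

data = list(range(1, 101))
# Both strategies reach the same answer; the second gets there by
# breaking the work into independent pieces.
assert cpu_style_sum(data) == gpu_style_sum(data) == 5050
```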
Huang noted on the call that, at its core, Nvidia's value proposition is that it offers the lowest total cost of ownership. Nvidia excels in several areas that make this so. First, it is a full-stack data center solution. It's not just about having the best chips; it's also about engineering and optimizing the software that lets users get the most out of that hardware. Indeed, on the conference call Huang called out a networking software suite named DOCA and an acceleration library named Magnum IO, commenting that "these two pieces of software are some of our company's crown jewels." He added, "Nobody ever talks about it because it's hard to understand, but it makes it possible for us to connect tens of thousands of GPUs." It's not about any one chip: Nvidia excels at optimizing the entire data center architecture, built from the ground up with all the parts working in unison. As Huang said, "Another way to think about it is that the computer is the data center, or the data center is the computer. It's not the chip. It's the data center, and it's never been like this before. In this particular environment, the network operating system, your distributed computing engines, your understanding of the architecture of the networking equipment, the switches, the computing systems, the computing fabric, that whole system is your computer, and that's what you're trying to run. And so in order to get the best performance, you have to understand the full stack, you have to understand the scale of the data center, and that is what accelerated computing is all about." Utilization is another key component of Nvidia's competitive advantage. As Huang pointed out, a data center that can only do one thing, even if it can do it incredibly quickly, will be underutilized. Nvidia's "universal GPU," by contrast, is capable of many things (thanks again to those vast software libraries), which makes for much higher utilization rates. Finally, there is corporate data center expertise.
During the call, Huang discussed the issues that can arise when building out a data center, noting that for some customers, construction can take up to a year. Nvidia, on the other hand, has refined the process: rather than months or a year, Huang said, Nvidia can measure delivery times in weeks. That's a major selling point for customers constantly looking to stay on the cutting edge of technology, especially as we enter this new era of AI with enormous market share now up for grabs. Bottom line As we look to the future, it's important to keep in mind that while ChatGPT was an eye-opening moment, an "iPhone moment" as Huang puts it, we're only at the beginning. The excitement around ChatGPT is less about what it can actually do and more about the proof of concept of what's possible. The first-generation iPhone, released 16 years ago next month, was nowhere near what we have today, but it showed people what a smartphone could really be. What we have now, to extend the metaphor, is the original first-generation iPhone. If you're going to own Nvidia, and not trade it, as we plan to, then, as impressive as generative AI applications already are, you should think less about what we have now and more about what this technology will be capable of by the time we get to the "iPhone 14 version" of generative AI. That's the really exciting (and somewhat scary) reason to hold shares of this AI juggernaut. (Jim Cramer's Charitable Trust is long NVDA, MSFT, AMZN, AAPL and GOOGL. See here for a full list of the stocks.) As a subscriber to the CNBC Investing Club with Jim Cramer, you will receive a trade alert before Jim makes a trade. Jim waits 45 minutes after sending a trade alert before buying or selling a stock in his charitable trust's portfolio. If Jim has talked about a stock on CNBC TV, he waits 72 hours after issuing the trade alert before executing the trade.
The above Investing Club information is subject to our terms and conditions and privacy policy, together with our disclaimer. No fiduciary obligation or duty exists, or is created, by virtue of your receipt of any information provided in connection with the Investing Club. No specific outcome or profit is guaranteed.
Nvidia CEO Jensen Huang in his usual leather jacket.