
Microsoft Bets Big on AI: What Investors Should Know

What companies the tech titan is investing in, and whether the stock is worth adding to your portfolio.

Microsoft Bets Big on AI: What Investors Should Know, Part One

Ivanna Hampton: Welcome to Investing Insights. I’m your host, Ivanna Hampton.

The age of AI appears to have arrived. OpenAI’s popular chatbot is answering questions from millions of people, and Microsoft MSFT has invested billions of dollars in the startup. The tech titan has also taken an early lead in the AI race. But others are also benefiting from the craze. Morningstar Research Services’ senior equity research analyst Dan Romanoff has written a report on Microsoft’s AI bet. Dan is joining the podcast.

OpenAI’s Origin Story

Dan, let’s kick off this conversation with OpenAI’s origin story. Talk about the founders, their original mission, and where they are today.

Dan Romanoff: So, in 2015 or thereabouts, you had a group of tech visionaries and luminaries and academics and accomplished professionals in AI, and those names include Sam Altman, who is one of the co-founders. He is the CEO now. Elon Musk, Peter Thiel. Greg Brockman is one of the AI minds that was behind all of this. And then, there is just a long list of who’s who kind of investors. AWS was involved, Infosys was involved. So, there were large corporate VC entities. Those are some of the names. The initial mission was to make AI friendly and make AI benefit all of humanity. And that was something they stressed repeatedly for a few years. Those were the foundational elements of it and some of the people involved. It was supposed to be a nonprofit. And I think that’s also an important point in the story.

Not that it matters so much, but eventually you had Microsoft come in and invest $1 billion initially a few years ago and then, more recently, what is believed to be another $10 billion. So, big numbers are being thrown around. At some point, they changed it to a for-profit, like a capped for-profit, so Microsoft can earn a return on it, but it’s limited to 100 times their initial investment. So, there’s plenty of upside for Microsoft. It’s not a nonprofit anymore, and there’s been some criticism of that. But realistically, this is a very expensive proposition. You can’t just, on the good graces of others, suck in all of these billions of dollars without people hoping to make a return on what could be the next big thing. So, that’s where we’re at now.

There’s been movement. Elon Musk is not involved anymore, for example. He has recently started his own AI company. He was concerned about the overlap with Tesla, which hires the same kinds of people to work on self-driving cars and AI. So, there have been some changes along the way. But it’s not a not-for-profit anymore, and at this point, OpenAI is pretty much a household name. So, that’s definitely an evolution as well.

OpenAI Dropped ChatGPT

Hampton: And a big moment happened last year. OpenAI dropped ChatGPT. Millions of people tried it out. What made that release a groundbreaking moment in AI?

Romanoff: In November 2022, it was just one day, “By the way, here’s ChatGPT.” And as a tech person, I’m like, “OK, cool. Let me try this.” I think that’s what millions of people did. In fact, I think they said it took five days to get a million users. That is a crazy statistic. I won’t call this a social-media product, but the usage went viral in the same way. That is a huge number of users. And a couple of months later, it was 100 million users. And again, that is way beyond anything that Facebook or whoever was doing. So, the popularity and the speed at which this took off is crazy. I think it’s because we’ve all had experiences with chatbots before—Verizon, Comcast. I have a customer service issue. It’s not a great experience, right? It’s very robotic. It’s very rigid. And you’re suffering through answering these questions you know are not relevant, right? And all of a sudden, I’m interacting with ChatGPT. People are interacting with a highly advanced AI product. And it is sort of groundbreaking because there it is, laid bare for everyone on Nov. 30, that this is the future. And it’s so far beyond what we had seen before from a chatbot or any of our limited experiences with AI. It was a lot closer to 2001: A Space Odyssey or The Terminator, not that there’s a robot chasing me around. But you can sort of start to bridge that gap as a person. So, I think a lot of people are doing that. And maybe that’s far-fetched, but you can sort of see it now, whereas before you’re like, “Yeah, yeah, sure.”

Hampton: What did you think of it when you tried it out?

Romanoff: Oh, I mean, I played with it for hours. My wife was giving me such a hard time; she was getting so mad at me. I just sat there asking ChatGPT to write TV pilots and to write songs and to take on the persona of this person and then write a speech about this topic. It was a lot of fun. I was blown away by the capabilities. And on that same night—and I literally sat there for hours—I played around with DALL·E, OpenAI’s image-generation tool. And it’s kind of the same thing. I’m just blown away. You can use text inputs, like “Draw me a picture of Christopher Walken staring at a sunset in an impressionist style.” I put it in a report I wrote because it was so funny. I was trying to show my wife, “Look at this, look what it did.” And then, you can start editing it all on the fly, again, just using natural text. So, the capabilities were amazing. And it was just this free tool. And as a tech person, I immediately grasped some of the implications of what will come from this. But just as a human, I’m like, this is so fun. That was my experience with it. But from a technology standpoint, it’s truly amazing. I appreciate the fun element, but also the advancement made.

Hampton: I’m glad you’re able to express your creativity using OpenAI.

Romanoff: My kids laughed more than my wife did. That’s for sure.

Microsoft and OpenAI Partnership

Hampton: Dan, Microsoft and OpenAI, they formed this partnership. Microsoft invested billions of dollars. What is Microsoft getting in return?

Romanoff: Right. Microsoft gets two things, I think, out of this partnership. The first is pretty obvious. They get the ability to commercialize certain OpenAI products, early versions of the large language models, now built around GPT-4. So, Microsoft gets to infuse that into their own software. They’re doing that as rapidly as humanly possible. You already see a couple of products actually generating revenue for Microsoft. And then, secondly—this is important to me—they get an exclusive tie-up where all of OpenAI’s AI instances run on Microsoft Azure. To me, that is almost the more important part: They get all of the knowledge from running AI models and what it takes to do that in the cloud. There’s a lot of learning, a lot of intangible assets, a lot of knowledge-based assets there. So, that’s really important for developing a lead in a technology product, and they’re definitely ahead of everyone at this point. I think those are the two things that I would highlight.

Hampton: But the deal is not exclusive. Talk about what else is worthwhile about it.

Romanoff: The exclusive part is that all of OpenAI’s instances run on Azure. So, that is exclusive. What’s not exclusive is that OpenAI can still go around selling its API, which is basically access to the large language models they’ve created. They can go sell that to anyone they want. So, at basically every one of the companies I cover, I hear the management team saying, “Yeah, we’re introducing a ChatGPT sort of instance in whatever product we’re selling.” And you hear it from pretty much everyone. I mean, from Salesforce to RingCentral, everyone is talking about it. So, that’s what OpenAI is doing. But Microsoft gets the access to what everyone is doing with AI. If all the OpenAI stuff is running on Azure, there’s a lot of knowledge there, a lot of learnings that other companies just don’t have access to and won’t have access to. So, I think that is really important, and that’s probably the main thing.

What Is AI’s Market Size?

Hampton: PitchBook estimated the AI market at $2.5 billion last year. Now, others are saying it’s worth a trillion dollars. What’s your view?

Romanoff: I mean, a trillion dollars is kind of nuts, right? I think we all—software analysts, anyway—tend to agree that the market for software is around $800 billion. So, to say that the market for AI is a trillion dollars is totally nuts, right? It’s like basically all of software at that point, and that’s clearly not true. So, $2.5 billion is probably closer to the truth for generative AI, and that’s what we’re talking about here. But AI has been around for a while. So, if you just tried to more generally size the AI market, you probably could come up with something that is, I don’t know, at least probably in the $15 billion-$20 billion range. And that would include stuff like Salesforce’s Einstein and Adobe’s Sensei. Those are pretty powerful, but previous-generation AI instances that help with decision-making and analytics and make predictions. But you see AI everywhere. So, Google search, Amazon search, if you’re looking for a product, all of that is algorithm-driven and AI-infused. How much revenue do we include from AWS, and how much do we include from Google? It’s not really a separate revenue line. So, probably you shouldn’t include much direct revenue. But it is a market that exists and is substantial already. But the generative AI stuff, yeah, is pretty small. I mean, OpenAI was reportedly generating $50 million in revenue last year, which is basically nothing. So, this is a very nascent technology. It’s very exciting. I’m sure it will be pretty big within a few years. But we’re talking billions of dollars, maybe $10 billion within a few years, but certainly not a trillion dollars.

Hampton: So, just not yet.

Romanoff: No, no.

Nvidia’s Role in Microsoft’s AI

Hampton: Well, Microsoft has made several AI investments, including in Nvidia NVDA. The chipmaker stock has skyrocketed this year. What role is Nvidia playing in Microsoft’s pursuit of AI dominance?

Romanoff: This applies a little more broadly than just Microsoft because Nvidia basically makes the most powerful GPUs, and really the only ones at the high end. These graphics processors process data in parallel, which is more efficient for this kind of work than the sequential processing you’d get from, say, an Intel desktop processor. That’s the super high-level difference. But if I were to build a computer at home, which we do, I can go pay a couple hundred bucks and get a pretty nice Intel desktop processor. If I wanted to go buy the latest Nvidia GPU, it would be roughly $40,000. So, there’s a little bit of a different price point. I’m not the Nvidia analyst, but you saw the quarter they just put up. I mean, the stock was up 30% or whatever it was. They just crushed it. And basically everyone from Google to Microsoft to Amazon, if you want to do anything in model training or running AI inference, but especially on the training side, you need processing power like that because you are basically going to run racks of servers for a month straight to ingest all of this data. It’s 24/7, just so much data coming in, basically the entire internet coming in, and it takes a long time to process that. So, you need basically the most efficient processing capability. That’s what Nvidia brings to the table. And Amazon, Google, and now Microsoft as well are developing their own processors that can do that task more efficiently.

So, what ends up happening, if you’re Microsoft or if you’re Amazon—you have Azure, AWS—is that you have a menu of services that you offer for AI. And if I’m the person who wants to use Azure to train one of these large language models, I can pick which processor I want to do it on. If I want to do it on an Nvidia processor, it’s going to cost me this much per hour. If I want to do it on an Intel processor, it’s going to cost me this much. If I want to do it on an internally developed chip, like Google’s TPU or Amazon’s Trainium, it’s going to cost this much. They all have different performance and price parameters. So, that’s the difference. But in terms of doing these large-language-model training instances, the flexibility that a GPU from Nvidia offers is important because you don’t necessarily know the direction that the training is going to take or what, say, problems, for lack of a better word, will come up. So, the GPU offers the most flexibility. And this is also new. There is a lot of uncertainty. I’ve heard it said by pretty advanced people in the AI community that no one really knows how these AI instances work. No one is responsible for the whole thing. You’ve got a bunch of engineers that are teaching something how to think or produce answers. So, no one really knows exactly what’s going to happen when you start unleashing all this data into a model you created. So, the flexibility offered by an Nvidia GPU helps cope with some of that.

Hampton: So, its advantage is the flexibility and the power that it provides?

Romanoff: Yes. You actually should do the answers because you’re much faster and much more succinct than I am.

Hampton: I am listening to you. You are educating me and the audience, and I appreciate that. What is Morningstar’s long-term view on Nvidia?

Romanoff: Well, I mean, from a stock perspective, I’m not sure where we even are on that. I know our analysts, just from a company perspective, would echo this—or maybe I’m echoing what they’re saying—but they would say the same thing: They are basically the single source for high-end GPUs. The only other real option there is AMD, which owns ATI. Those are the two. So, it’s very high-end stuff. From a stock perspective, quite honestly, I’m not sure where they fall on that.

Hampton: Well, our audience can go to Morningstar.com to get some more details on that.

That was part one of Microsoft Bets Big on AI: What Investors Should Know. Dan will share his long-term outlook on Microsoft in part two. Thanks to him for providing insights on the AI race, including Nvidia’s resurgence. Thanks also to video producer Daryl Lannert. I appreciate you for checking out Investing Insights. Subscribe to Morningstar’s YouTube channel to see new videos from our team. You can hear market trends and analyst insights from Morningstar on your Alexa devices; say, “Play Morningstar.” I’m Ivanna Hampton, a senior multimedia editor at Morningstar. Take care.

Read About Topics From This Episode

After Earnings, Is Microsoft Stock a Buy, a Sell, or Fairly Valued?

Microsoft and AI: Believe the Hype?

Microsoft Earnings: Solid Performance, With Azure Strength Offset by Margin Pressure in 2024

Before Investing in AI Stocks, Consider This

The author or authors do not own shares in any securities mentioned in this article. Find out about Morningstar’s editorial policies.
