Microsoft Continues To Expand Its Use Of Generative AI And Increase Revenue

Microsoft (MSFT) is steadily growing its investments to support ongoing AI research and is monetizing AI across its product line.

Due in part to Microsoft’s early AI accomplishments, the stock last week reached a fresh all-time high of $433.60. With a recent trade of $415, Microsoft shares are up 10.4% year to date.

Azure’s share of the cloud market is growing as more customers leverage Microsoft’s platforms and tools to create their own AI solutions. The number of significant, long-term Azure deals across the enterprise client base increased at an accelerated rate in the company’s fiscal Q3 (March). The number of Azure deals worth $10 million or more more than doubled, while the number of deals worth $100 million or more increased by more than 80% year over year.

Microsoft Cloud revenue increased 23% to $35.1 billion in the third quarter. Azure’s 31% sales growth came in roughly 300 basis points above the company’s guidance and outpaced the 28.6% consensus projection. Microsoft guided for Azure growth of 30% to 31% in the fourth quarter (June). In response to the Q3 results, JP Morgan raised its price target for Microsoft from $440 to $470, citing the possibility of faster Azure growth in the coming year.

Larger Azure transactions in the March quarter helped drive overall commercial bookings growth of 31% in constant currency, significantly faster than the 9% rise in FQ2. Commercial remaining performance obligation (RPO) of $235 billion increased 21% in constant currency, a marked acceleration from 16% growth in FQ2. Total revenue rose 17% to $61.9 billion, exceeding the consensus estimate of $60.8 billion.

Morgan Stanley thinks Microsoft has substantial room to grow in the AI space because the innovation cycle is still in its early stages. AI services contributed roughly 700 basis points to Azure’s growth in FQ3, up from a 600-basis-point contribution in the prior quarter. Morgan Stanley maintained its $520 price target on Microsoft.

Goldman Sachs raised its target by $65 to $515, noting that the company offers a distinctive growth profile at scale and the potential to grow both revenue and earnings by double digits in FY’25 (June). The firm believes Microsoft is well positioned to capture a share of generative AI revenue thanks to its extensive array of AI services and productivity-centric approach. Goldman Sachs also raised its Azure projection to better reflect the combination of factors that can support 25% growth through FY’25.

Azure is now “a port of call for pretty much anybody who is doing an AI project,” Microsoft CEO Satya Nadella said on the company’s third-quarter earnings call. AI is driving expansions throughout the installed base and attracting new customers to Azure. Nadella also noted that AI projects are not isolated: while calls to AI models are the first step, these projects also incorporate other services such as vector databases, developer tools, and Azure Search.

Microsoft is emerging as a leader in AI breakthroughs. Large language models (LLMs) have been the center of attention lately, but their scale makes them computationally demanding. Microsoft’s new class of capable small language models (SLMs) will make AI accessible to a wider audience. Though trained on less data, these SLMs offer many of the same capabilities as LLMs.

Microsoft just unveiled the Phi-3 line of smaller models. These SLMs outperform models of the same size, and even the next size up, across a range of benchmarks assessing language, coding, and math skills. SLMs are easier to deploy for organizations with limited resources; they can be fine-tuned to suit particular requirements and are designed to handle simpler tasks well. Customers can choose from a variety of models, large or small, depending on what best fits their needs.

LLMs remain better suited for applications requiring the orchestration of complex processes involving sophisticated reasoning, data processing, and context awareness. Companies can use an SLM to build apps that run locally on a device (as opposed to in the cloud) when a task doesn’t require complex reasoning, or when a prompt response is needed. SLMs are especially beneficial for regulated sectors and companies that require high-quality results yet prefer to retain data on-site.

Longer term, there is an opportunity to equip smartphones and other edge devices with more powerful SLMs. Use cases could include AI-enabled traffic systems, automobile computers, smart manufacturing-floor sensors, remote cameras, and environmental-compliance monitoring equipment. By keeping data on the device, users can minimize latency and maximize privacy.
