Microsoft (MSFT) is launching its next generation of custom artificial intelligence chips, taking aim at cloud rivals Amazon (AMZN) and Google (GOOG, GOOGL).
The chip, called Maia 200, will run in Microsoft’s own data centers before the company eventually makes it available to a wider customer base.
Like Google’s TPU and Amazon’s Trainium processors, Microsoft’s second AI chip is designed to give the Windows maker more flexibility in how it supports its AI services. By using its own in-house developed chips, the company ensures that it doesn’t have to rely entirely on processors developed by Nvidia (NVDA) or AMD (AMD).
Google and Amazon have been using their own custom chips for years, while Microsoft has been slow to adopt in-house AI chips.
According to Microsoft, the Maia 200 will be manufactured using TSMC’s 3nm process and is designed to run large-scale artificial intelligence workloads while “delivering efficient price/performance.”
Maia 200 will be built into large server racks, with four chips per rack. Microsoft also touts how quickly it can deploy new chips into data centers, saying the chips can be installed and running AI models within days of the parts arriving.
Getting AI servers up and running quickly is an important aspect of the wider data center business. It’s not just a matter of reducing construction costs, either. Every day a chip sits idle is a day it isn’t generating revenue by running AI applications.
Maia 200 adds to the growing competition Nvidia faces from AMD and its own customers. Microsoft’s Maia 100 already powers AI models from the company and OpenAI (OPAI.PVT), while Google and Amazon both power their own models and those from Anthropic (ANTH.PVT).
In November, The Information reported that Meta was in discussions with Google about using the search giant’s TPUs in its own data centers to power its artificial intelligence services. This caused Nvidia’s stock price to drop at the time, as Wall Street worried that the company was in danger of losing market share.
Nvidia shares are up less than 1% since the beginning of the year.
While Google, Amazon and Microsoft have encroached on Nvidia’s turf, they are unlikely to pose a serious threat to the AI leader. Experts say that while cloud companies’ AI chips may work well within their own services, they are unlikely to translate easily to smaller third-party customers.
Nvidia’s chips are also highly sought after because they are designed to be versatile, allowing companies to use them for a range of applications and services.
As for performance, the Maia 200 won’t replace Nvidia’s chips, but Microsoft claims it surpasses Google’s latest TPU and Amazon’s latest Trainium chip in many categories. The Maia 200 also comes with more high-bandwidth memory than Google’s or Amazon’s products, which is key to running high-performance AI applications.