Microsoft is rolling out its second-generation AI chip, the Maia 200, with the aim of reducing dependence on Nvidia hardware while improving efficiency across cloud and AI services.
The processors are manufactured by Taiwan Semiconductor Manufacturing Co. and are being deployed at Microsoft data centers in Iowa, with additional deployments planned for the Phoenix area.
Some of the first chips will be allocated to Microsoft’s superintelligence team to generate data for future AI model development, cloud and AI chief Scott Guthrie said in a blog post.
The Maia 200 also powers Copilot for enterprise customers and supports AI models, including those from OpenAI, delivered through Microsoft’s Azure cloud platform.
Microsoft began developing its own chips several years after Amazon and Alphabet launched in-house silicon efforts aimed at lowering costs and improving performance.
The scarcity and high prices of Nvidia’s latest processors have increased competition among cloud providers to secure alternative sources of computing power.
The company says the Maia 200 outperforms comparable chips from Amazon Web Services and Google on some AI workloads and is the most efficient inference system Microsoft has deployed to date.
Microsoft is already designing a successor, the Maia 300, and retains access to OpenAI’s early-stage chip designs under a strategic partnership.
Microsoft shares rose nearly 2% on Monday as investors looked ahead to the company’s second-quarter earnings report on Wednesday. The software maker’s market capitalization remains close to $3.85 trillion.
Meanwhile, Nvidia stock was trading down about 0.5% on the day as the chipmaker announced a $2 billion investment in CoreWeave.

