Asian American Daily

Microsoft Maia 200 Joins Wave of Competition for Nvidia's AI Chips
By Reuters | 26 Jan, 2026

The Maia 200 will join AI chips from Google and Amazon as potential alternatives in a segment in which Nvidia is likely to retain its dominant position with the rollout of its Vera Rubin GPU.

Microsoft on Monday unveiled the second generation of its in-house artificial intelligence chip, along with software tools that take aim at one of Nvidia's biggest competitive advantages with developers.

The new "Maia 200" chip comes online this week in a data center in Iowa, with plans for a second location in Arizona, Microsoft said. It is the second generation of an AI chip called Maia that Microsoft introduced in 2023.

The Maia 200 comes as major cloud computing firms such as Microsoft, Alphabet's Google and Amazon.com's Amazon Web Services, some of Nvidia's biggest customers, are producing their own chips that increasingly compete with Nvidia.

Google, in particular, has garnered interest from major Nvidia customers such as Meta Platforms, which is working closely with Google to close one of the biggest software gaps between Google's AI chips and Nvidia's offerings.

For its part, Microsoft said that along with the new Maia chip, it will be offering a package of software tools to program it. That includes Triton, an open-source software tool with major contributions from ChatGPT creator OpenAI that takes on the same tasks as CUDA, the Nvidia software that many Wall Street analysts say is Nvidia's biggest competitive advantage.

Like Nvidia's forthcoming flagship "Vera Rubin" chips introduced earlier this month, Microsoft's Maia 200 is made by Taiwan Semiconductor Manufacturing Co using 3-nanometer chipmaking technology and will use high-bandwidth memory chips, albeit an older and slower generation than Nvidia's forthcoming chips.

But Microsoft has also taken a page from the playbook of some of Nvidia's rising competitors by packing the Maia 200 chip with a significant amount of what is known as SRAM, a type of memory that can provide speed advantages for chatbots and other AI systems when they field requests from a large number of users.

Cerebras Systems, which recently inked a $10 billion deal with OpenAI to supply computing power, leans heavily on that technology, as does Groq, the startup that Nvidia licensed technology from in a reported $20 billion deal.

(Reporting by Stephen Nellis in San Francisco; Editing by Jamie Freed)

Microsoft logo is seen near computer motherboard in this illustration taken January 8, 2024. REUTERS/Dado Ruvic/Illustration