
Microsoft Launches Maia 200 Chip for AI Inference Tasks



Microsoft has launched its latest chip, the Maia 200, the successor to the Maia 100 introduced in 2023. It is a 3nm chip designed specifically for artificial intelligence (AI) inference, delivering a peak of 10 petaflops at FP4 precision and 5 petaflops at FP8.


An inference chip runs models that have already been trained. As such, the Maia 200 is not a competitor to the Rubin series chips that NVIDIA announced at CES earlier this month. Instead, it competes with Google's Tensor Processing Units (TPUs) and Amazon's Trainium chips used in cloud data centers.


Microsoft claims that the Maia 200 offers three times the FP4 performance of Amazon's third-generation Trainium and higher FP8 performance than Google's seventh-generation TPU.
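As a rough back-of-the-envelope sketch using only the figures quoted above (the implied competitor number is a derived estimate, not a published specification):

```python
# Headline figures from Microsoft's announcement
maia200_fp4_pflops = 10.0  # peak FP4 throughput
maia200_fp8_pflops = 5.0   # peak FP8 throughput

# Halving the bit width (FP8 -> FP4) doubles the chip's peak throughput
fp4_over_fp8 = maia200_fp4_pflops / maia200_fp8_pflops

# If Maia 200's FP4 figure is "three times" Trainium 3's, the implied
# Trainium 3 FP4 peak follows by simple division (an estimate only).
implied_trainium3_fp4 = maia200_fp4_pflops / 3

print(f"FP4 vs FP8 throughput ratio: {fp4_over_fp8:.0f}x")
print(f"Implied Trainium 3 FP4 peak: ~{implied_trainium3_fp4:.1f} petaflops")
```

This is only arithmetic on the vendor's own claims; real-world inference throughput depends heavily on memory bandwidth and model characteristics, not just peak flops.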


Despite having only just been announced, the Maia 200 is already in use at Microsoft's data center in Des Moines, Iowa, where it runs Microsoft 365 Copilot and GPT-5.2 and serves the company's AI Superintelligence team. The chip will next be deployed at a data center in Phoenix, Arizona.


The Maia chips were developed in-house by Microsoft to reduce its dependence on NVIDIA, which in turn lowers the cost of building data centers, a bill that runs to hundreds of billions of dollars.
