Meta Announces New AI Chip That Speeds Up Model Training Process

Several new artificial intelligence (AI) chips were announced this week: one from Google yesterday and one from Meta today. Meta's second-generation Meta Training and Inference Accelerator (MTIA) chip is designed to speed up the training of AI models as well as inference.

The original MTIA V1 chip was announced in 2023, and the new MTIA V2 is now in production, with deployment planned from 2025. In terms of performance, MTIA V2 runs at 1.3 GHz with 256 MB of on-chip memory, an improvement over MTIA V1, which runs at 800 MHz with 128 MB of on-chip memory. The manufacturing process also differs: the V1 is built on TSMC's 7nm technology, while the V2 uses TSMC's 5nm.
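For a rough sense of the generational gains, the headline figures reported above can be compared directly (a simple illustration only; the ratios are derived from the announced specs, not separately benchmarked):

```python
# Headline specs of Meta's MTIA chips as reported above (illustrative only).
mtia_v1 = {"clock_ghz": 0.8, "memory_mb": 128, "process_nm": 7}
mtia_v2 = {"clock_ghz": 1.3, "memory_mb": 256, "process_nm": 5}

# Generation-over-generation ratios.
clock_ratio = mtia_v2["clock_ghz"] / mtia_v1["clock_ghz"]
memory_ratio = mtia_v2["memory_mb"] / mtia_v1["memory_mb"]

print(f"Clock speed: {clock_ratio:.2f}x")      # ~1.6x
print(f"On-chip memory: {memory_ratio:.0f}x")  # 2x
```

These raw ratios alone do not explain the reported training-speed gains, which also reflect architectural changes and the move to a smaller process node.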


In tests conducted so far, the V2 can run the training process up to three times faster than the V1. Major companies such as Meta, Intel, Samsung, and Google are releasing their own AI chips to reduce reliance on NVIDIA hardware. Last month, Intel, Google, Arm, Qualcomm, Samsung, and several other technology companies established the Unified Acceleration Foundation (UXL) in an effort to counter NVIDIA's dominance in AI.