Meta today shared plans to develop up to four of its own chips, a move aimed at optimizing operations and reducing its spending on processors. These chips are expected to focus on inference, the stage at which AI models process requests and generate responses for users.
Meta said the chips fall under its Meta Training & Inference Accelerator (MTIA) program, with the first known as the MTIA 300. The four chips will be introduced in stages from this year through 2027, with new versions expected roughly every six months.
Meta also stated that the MTIA 400 is designed for use in data centers. Running Facebook, Instagram, Threads, and WhatsApp, Meta is constantly expanding its data centers and regularly needs additional processing capacity.
That demand is compounded by the integration of artificial intelligence, which requires still more processing and inference capacity. Meta will collaborate with Broadcom on elements of the chip design and use TSMC for manufacturing.
Alongside developing its own chips, Meta has previously announced collaborations with AMD and NVIDIA. The rapid pace of AI development has pushed many internet giants toward in-house solutions: Microsoft and Google have likewise announced their own chips to power data center and artificial intelligence workloads.