Facebook-parent company Meta is the latest tech giant to announce that it is working on custom chips to tackle AI tasks – training and inference. The social media powerhouse has revealed a suite of four in-house AI chips designed to power its massive data centre expansion.
With this move, Meta joins rivals like Google, Microsoft and Amazon, all of which have developed their own specialised chips to reduce their reliance on expensive and supply-constrained hardware from vendors like Nvidia and AMD.

Meta is moving fast: the company plans to release new versions of its Meta Training and Inference Accelerator (MTIA) chips every six months. The company said it has developed “a competitive strategy” that prioritises “rapid, iterative development, an inference-first focus, and frictionless adoption by building natively on industry standards.”
Why Meta will launch a new chip every 6 months
Meta says the six-month cadence is meant to keep pace with evolving AI techniques and technologies. It also says the focus will be on inference rather than on the most demanding workloads.

“While the industry typically launches a new AI chip every one to two years, we’ve developed the capacity to release ours every six months or less by building on our modular, reusable designs. This accelerated pace enables us to quickly adapt to evolving AI techniques, adopt the latest hardware technologies, and minimize costs associated with developing and deploying new chip generations,” the company explained.
“Mainstream chips are typically built for the most demanding workload — large-scale GenAI pre-training — and then applied, often less cost-effectively, to other workloads like GenAI inference. We take the opposite approach: MTIA 450 and 500 are optimized first for GenAI inference, and they can then be used to support other workloads as needed, including ranking and recommendations training and inference, as well as GenAI training. This keeps MTIA well-tuned to the anticipated growth in GenAI inference demand,” the company pointed out.
One of Meta’s in-house chips is already deployed
MTIA 300 has already been deployed and handles “ranking and recommendation” tasks – the engine that decides which ads and posts you see on Facebook and Instagram. Meanwhile, MTIA 400, 450, and 500 are designed for “generative AI,” such as creating high-quality images and videos from text prompts. Meta says testing on the 400 model is already complete, with the others expected to be operational by 2027.

While Meta recently inked massive multi-year deals to buy millions of GPUs from Nvidia and AMD, the company wants more “leverage.” Yee Jiun Song, Meta’s VP of Engineering, told CNBC that custom chips allow the company to “squeeze more price per performance” across its fleet. “This provides us with more diversity in terms of silicon supply and insulates us from price changes to some extent,” Song said.