LAUNCHES

Meta Deploys Four New MTIA AI Chip Generations in Two Years

megaone_admin · Mar 28, 2026 · 2 min read
Engine Score 8/10 — Important

This story details Meta's significant progress in developing its custom AI chips, which directly affects its ability to scale AI experiences for billions of users. The information comes from a primary source, is highly reliable, and signals major strategic advances in AI hardware.


Meta has deployed four successive generations of its homegrown AI chips in just two years, expanding from recommendation systems to large language model inference as the company scales AI experiences for billions of users. The Meta Training and Inference Accelerator (MTIA) family, developed in partnership with Broadcom, now includes chips optimized for different AI workloads across Meta’s platforms.

The company has already deployed “hundreds of thousands of MTIA chips in production” and tested the hardware with large language models like Llama, according to Meta’s announcement. The rapid iteration cycle represents a departure from traditional chip development timelines, which typically span two years from design to production.

Meta published research papers detailing the first two MTIA generations at ISCA’23 and ISCA’25 conferences. The newer generations include MTIA 300 for ranking and recommendation training, MTIA 400 with a 72-accelerator scale-up domain for GenAI models, and MTIA 450 with doubled high-bandwidth memory (HBM) for GenAI inference optimization.

“AI models are evolving faster than traditional chip development cycles,” Meta stated in the blog post. “Rather than placing a bet and waiting for a long period of time, we deliberately take an iterative approach: Each MTIA generation builds on the last, using modular chiplets, incorporating the latest AI workload insights and hardware technologies, and deploying on a shorter cadence.”

MTIA 400 has completed lab testing and is scheduled for data center deployment, while MTIA 450’s HBM bandwidth exceeds “that of existing leading commercial products,” according to Meta. The company plans deployments through 2027 as it expands workload coverage from recommendation systems to general GenAI applications.



MegaOne AI Editorial Team

MegaOne AI monitors 200+ sources daily to identify and score the most important AI developments. Every story is reviewed by our editorial team, fact-checked, linked to primary sources, and rated using our six-factor Engine Score methodology.
