Meta has partnered with Arm to build a new class of CPU designed for AI workloads. The two companies have jointly introduced the Arm AGI CPU, a processor built to handle the demands of modern AI data centers.
Meta says its data centers are already running into the limits of general-purpose processors, especially as it scales up its AI models and services. The partnership is a direct response to that problem.
What the Arm AGI CPU Offers
The Arm AGI CPU is built to deliver better performance per rack without consuming more power: Arm claims more than 2x the performance per rack of older x86 CPUs at comparable power draw.
The chip comes with up to 136 Arm Neoverse V3 cores, offers high memory bandwidth and low latency, and is designed to sustain continuous workloads without throttling. For large deployments, configurations can scale to tens of thousands of cores per rack using advanced cooling.
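To make the rack-scale claim concrete, a quick back-of-envelope calculation shows how many 136-core sockets it would take to reach "tens of thousands of cores per rack". The 20,000-core target below is a hypothetical figure chosen for illustration, not a vendor specification.

```python
# Back-of-envelope sketch: sockets needed to reach a hypothetical
# 20,000-core rack with 136-core Arm AGI CPUs. Only the 136-core
# figure comes from the announcement; the target is assumed.
CORES_PER_CPU = 136
TARGET_CORES_PER_RACK = 20_000  # hypothetical "tens of thousands" target

sockets_needed = -(-TARGET_CORES_PER_RACK // CORES_PER_CPU)  # ceiling division
print(sockets_needed)  # 148
```

At that density, cooling becomes the binding constraint, which is consistent with the article's note that such configurations rely on advanced cooling.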
Why This Chip Matters Now
A big reason behind this development is the rise of agentic AI, where systems can carry out tasks, make decisions, and coordinate actions on their own. This kind of AI needs CPUs that can handle orchestration, data movement, and control tasks alongside GPUs. Arm believes this shift could lead to a more than fourfold increase in CPU demand inside AI data centers over time.
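The CPU's role in this division of labor can be sketched in a few lines: the CPU fans work out to accelerators, moves the results back, and makes control decisions on each one. This is a toy illustration only; the function names and thread-pool stand-in are assumptions, not any real Meta or Arm API.

```python
# Toy sketch of the CPU-side orchestration pattern described above:
# the CPU dispatches batches to accelerators, gathers results, and
# applies control logic. accelerator_step is a hypothetical stand-in
# for work that would actually run on a GPU.
from concurrent.futures import ThreadPoolExecutor

def accelerator_step(batch):
    # stand-in for offloaded accelerator work
    return [x * 2 for x in batch]

def orchestrate(batches):
    results = []
    with ThreadPoolExecutor() as pool:
        # CPU fans the work out ...
        futures = [pool.submit(accelerator_step, b) for b in batches]
        for f in futures:
            out = f.result()
            # ... then makes a control decision on each result
            if sum(out) > 0:
                results.append(out)
    return results

print(orchestrate([[1, 2], [3, 4]]))  # [[2, 4], [6, 8]]
```

The point of the sketch is that the dispatch, data movement, and decision steps all run on the CPU, which is why agentic workloads raise CPU demand even in GPU-heavy data centers.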
Meta’s Direct Role in Building the Chip
Meta is not just a customer here. The company is a co-developer of the Arm AGI CPU and plans to use it alongside its own custom silicon, the Meta Training and Inference Accelerator (MTIA). Meta also confirmed it will share its board and rack designs for this CPU through the Open Compute Project later this year.
For Arm, this launch marks a shift from just licensing chip designs to delivering full silicon products for data centers. The company says this gives partners more flexibility to deploy Arm-based infrastructure, either by licensing designs or using ready-built chips.
Availability
The Arm AGI CPU already has support from cloud providers and hardware vendors. Early systems are available through partners now, with broader availability expected later this year.
With this partnership, Meta and Arm are aiming to build infrastructure that can handle the next phase of AI growth, where performance, efficiency, and scalability all matter equally.
