Arm and Meta power next era of AI for billions worldwide

16 Oct, 2025
Newsdesk
Cambridge-headquartered chip architecture designer Arm has joined forces with Californian tech giant Meta to enable richer, more accessible AI experiences for billions of people worldwide, across the full spectrum of compute.
Arm CEO Rene Haas. Credit – Arm.

Arm CEO Rene Haas believes the partnership is a tech game-changer. He said: “AI’s next era will be defined by delivering efficiency at scale. Partnering with Meta, we’re uniting Arm’s performance-per-watt leadership with Meta’s AI innovation to bring smarter, more efficient intelligence everywhere – from milliwatts to megawatts.”

The joint initiative aligns Arm’s leadership in power-efficient compute with Meta’s innovation in AI products, open technologies and data centre infrastructure.

The multi-year partnership builds on an ongoing hardware and software co-design effort between the companies. Meta’s foundational AI software technologies – including PyTorch – are now optimised for Arm, including PyTorch’s ExecuTorch runtime, which uses Arm KleidiAI to maximise performance-per-watt for Meta and the global open source community.
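For illustration, the sketch below follows the publicly documented ExecuTorch export flow: a PyTorch module is captured with torch.export, lowered to ExecuTorch’s Edge dialect and serialised to a .pte program that the on-device runtime can execute. The model and file names are purely illustrative; on Arm builds the KleidiAI micro-kernels are reached through ExecuTorch’s CPU delegates, and the exact partitioner imports for that lowering step vary between releases, so it is omitted here.

```python
import torch
from executorch.exir import to_edge  # ExecuTorch export API (module path per recent releases)

# A small, purely illustrative PyTorch module.
class TinyClassifier(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = torch.nn.Linear(128, 10)

    def forward(self, x):
        return torch.nn.functional.softmax(self.fc(x), dim=-1)

model = TinyClassifier().eval()
example_inputs = (torch.randn(1, 128),)

# Capture the graph, lower it to the Edge dialect, then serialise a program
# that the ExecuTorch runtime can load on device.
exported = torch.export.export(model, example_inputs)
edge_program = to_edge(exported)
et_program = edge_program.to_executorch()

with open("tiny_classifier.pte", "wb") as f:
    f.write(et_program.buffer)
```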

“From the experiences on our platforms to the devices we build, AI is transforming how people connect and create. Partnering with Arm enables us to efficiently scale that innovation to the more than 3 billion people who use Meta’s apps and technologies,” said Santosh Janardhan, Head of Infrastructure at Meta.

Meta’s AI ranking and recommendation systems – which power discovery and personalisation across Meta’s family of apps, including Facebook and Instagram – will leverage Arm’s Neoverse-based data centre platforms to deliver higher performance and lower power consumption compared to x86 systems.

Across its infrastructure, Arm Neoverse will also allow Meta to achieve performance-per-watt parity – underscoring the efficiency and scalability of Arm compute at hyperscale.

The companies worked closely to optimise Meta’s AI infrastructure software stack – from compilers and libraries to major AI frameworks – for Arm architectures. This includes the joint tuning of open source components, such as Facebook GEneral Matrix Multiplication (FBGEMM) and PyTorch, exploiting Arm’s vector extensions and performance libraries, producing measurable gains in inference efficiency and throughput. These optimisations are being contributed back to the open source community, to broaden their impact across the global AI ecosystem.
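The work FBGEMM and KleidiAI accelerate is low-precision matrix multiplication inside linear and embedding layers. As a hedged sketch only, and not a description of Meta’s actual stack, PyTorch’s built-in dynamic quantisation shows how such int8 GEMM kernels are exercised from ordinary framework code; which backend kernels are dispatched depends on the build and the host CPU.

```python
import torch

# An illustrative fully connected model; the quantised kernels backing
# nn.Linear here come from libraries such as FBGEMM (x86) or Arm-optimised
# CPU backends, depending on how PyTorch was built.
model = torch.nn.Sequential(
    torch.nn.Linear(512, 512),
    torch.nn.ReLU(),
    torch.nn.Linear(512, 128),
).eval()

# Dynamic quantisation converts the Linear weights to int8 and quantises
# activations on the fly, so inference runs on int8 GEMM kernels.
quantized = torch.ao.quantization.quantize_dynamic(
    model, {torch.nn.Linear}, dtype=torch.qint8
)

x = torch.randn(32, 512)
with torch.inference_mode():
    out = quantized(x)
print(out.shape)  # torch.Size([32, 128])
```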

The partnership also deepens collaboration on AI software optimisations across the PyTorch machine learning framework, the ExecuTorch edge-inference runtime and the vLLM data centre inference engine, building on the foundation of ExecuTorch, now optimised with Arm KleidiAI, to improve efficiency on billions of devices. Together, the companies aim to make model deployment easier and to increase the performance of AI applications from edge to cloud.
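For context, vLLM’s offline inference entry point looks like the sketch below; the checkpoint name is only an example, and nothing here is specific to the Arm and Meta optimisation work described above.

```python
from vllm import LLM, SamplingParams

# Load a model for offline batch inference; the checkpoint name is
# illustrative, and any causal LM supported by vLLM can be used instead.
llm = LLM(model="facebook/opt-125m")

sampling = SamplingParams(temperature=0.8, top_p=0.95, max_tokens=64)

prompts = [
    "Explain why performance-per-watt matters for AI inference.",
    "Summarise hardware/software co-design in one sentence.",
]

# generate() batches the prompts and returns one RequestOutput per prompt.
for output in llm.generate(prompts, sampling):
    print(output.outputs[0].text.strip())
```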

These open source technology projects are central to Meta’s AI strategy – enabling the development and deployment of everything from recommendations to conversational intelligence.