Arm delivers record-breaking quarter and full-year results
Today (May 6, 2026) Arm (NASDAQ: ARM) published a letter to its shareholders containing the company’s results for its fourth quarter and fiscal year 2026 ended March 31, 2026. The infographic below provides the key highlights for the quarter:

- Arm delivered record-breaking quarterly and full-year results, with Q4 FYE26 revenue reaching $1.49 billion and full-year revenue at $4.92 billion. FYE26 was Arm’s third consecutive fiscal year of more than 20% revenue growth since going public.
- Arm delivered record full-year royalty revenue of $2.61 billion, with Q4 FYE26 royalty revenue of $671 million, driven by growth across smartphones, Edge AI, Physical AI, and Cloud AI; data center royalties more than doubled year-over-year. Meanwhile, Arm’s Q4 FYE26 licensing revenue reached a record $819 million, with full-year licensing revenue at $2.31 billion.
- At the Arm Everywhere event on March 24, Arm introduced the Arm AGI CPU, its first production silicon product purpose-built for agentic AI, which puts Arm at the center of the AI infrastructure that will define the next decade of computing. The Arm AGI CPU was developed in response to a clear customer need for a faster, more integrated way to deploy the Arm platform at data center scale. Here are the highlights:
- Arm AGI CPU will deliver more than 2x performance per rack compared with x86-based platforms, reducing AI data center capital expenditure by up to $10 billion per gigawatt.*
- Meta is the lead partner and co-developer, working with Arm on a multi-generation roadmap to support personal superintelligence for more than 3 billion users.
- Customer response to the Arm AGI CPU is already strong, with more than $2 billion of customer demand across FYE27 and FYE28 – more than double the figure announced at Arm Everywhere.
- Cerebras, OpenAI, Positron and Rebellions are integrating the Arm AGI CPU alongside accelerator-based systems.
- Verda, a European AI cloud provider, recently announced that it will deploy the Arm AGI CPU for agentic AI orchestration.
- Commercial systems based on the Arm AGI CPU are now available to order from ASRock, Lenovo, Quanta, and Supermicro.
- The launch of the Arm AGI CPU marks a new chapter for the company: expanding the Arm compute platform into production silicon. Customers can now consume the Arm compute platform using IP, Compute Subsystems, and silicon, all while maintaining a common architecture and software ecosystem.
- More than 50 leading companies have supported the expansion of the Arm compute platform into silicon, including AWS, Broadcom, Google Cloud, Marvell, Microsoft, Micron, NVIDIA, Oracle, Samsung, SK Hynix, and TSMC.
- The momentum behind the Arm AGI CPU builds on Arm’s existing scale in the cloud, with Arm now representing about 50% of CPU compute among top hyperscalers. Amazon, Google, and NVIDIA are already integrating Arm-based CPUs as head nodes alongside accelerator-based systems. Recent developments highlight this shift:
- Google announced that it is replacing x86 host processors with custom Arm-based Axion CPUs, which provide up to 2x better performance-per-watt, for its next-generation TPUs – TPU8t for training and TPU8i for inference. The system-level performance improvements are significant: TPU8t delivers up to 2.7x better training performance-per-dollar, and TPU8i delivers up to 80% better inference performance-per-dollar, versus the previous generation built on x86.
- NVIDIA announced Vera, its next-generation Arm-based CPU purpose-built for agentic AI. Paired tightly with NVIDIA GPUs to maximize utilization, Vera delivers up to 50% faster performance and 2x higher efficiency than x86 CPUs.
- Microsoft is advancing Arm-based Cobalt CPUs, now deployed across a substantial portion of Azure regions and powering production workloads for customers including Databricks, Siemens, and Snowflake.
- AWS reported that its custom silicon business, including Arm-based Graviton alongside Trainium and Nitro, is now running at more than $20 billion annually and growing at a triple-digit rate year-over-year.
- Anthropic is using Trainium alongside tens of millions of Arm-based Graviton cores for generative AI workloads.
- For enterprises, Arm offers customers one architecture and one software ecosystem to move workloads seamlessly between the public cloud and private cloud to optimize performance, efficiency, data locality, and cost.
- SAP will move core database and business application workloads to Arm, starting with the AWS Graviton and expanding to run on internal Arm AGI CPUs.
- Cloudflare will deploy Arm across its global network to support traffic management, security, and AI inference closer to users.
- There are also further Arm design wins with key network infrastructure providers, including F5 and SK Telecom.
- Beyond the data center, AI is moving into every device and physical system, with Arm already having the world’s broadest compute footprint. With over 350 billion chips shipped and over 22 million developers, the Arm compute platform is positioned to bring AI from cloud infrastructure to the edge and physical worlds with a common architecture and ecosystem.
The full letter to shareholders is available on the Arm investor relations website (https://investors.arm.com/financials/quarterly-annual-results). Information about the Arm earnings conference call is also available on that site.
Forward-looking statements
This blog post contains forward-looking statements regarding Arm’s future performance, market opportunity, and customer demand. These statements are based on current expectations and are subject to risks and uncertainties that could cause actual results to differ materially. For a discussion of factors that could affect Arm’s results, please refer to Arm’s filings with the U.S. Securities and Exchange Commission.
All product and company names are trademarks or registered trademarks of their respective holders.
* Based on Arm internal estimates.
Any re-use permitted for informational and non-commercial or personal use only.