May 2025: Arm Innovations Shaping the Future of AI, Cloud, and Edge

Every breakthrough in compute brings us closer to a more intelligent, efficient, and connected world.
At Arm, these breakthroughs are happening every day, driven by developers, engineers, and partners working across AI, cloud, edge, and embedded systems. In May 2025, we saw major milestones in server infrastructure, developer tooling, and on-device AI.
From Korea’s first Arm-based datacenter servers to faster CI/CD workflows on Windows, the pace of innovation continues to accelerate. This roundup features seven key highlights from across the Arm ecosystem, each one showing how scalable, high-performance, and energy-efficient computing is transforming industries and experiences worldwide.
Korea’s first Arm-based server pioneer speaks out
FunFun Yoo, CEO of XSLAB Inc., is reshaping the future of datacenter infrastructure. Yoo shares how XSLAB became Korea’s first manufacturer of Arm-based servers, and how its in-house baseboard management controller (BMC) technology tackles the challenges of remote server management head-on. As AI and cloud computing drive up power demands, learn why scalable, energy-efficient solutions like Arm are key to building a smarter, more sustainable digital future.
Bringing audio generation to everyone with Arm and KleidiAI
Generative AI is expanding from images and text to audio, enabling developers to create music and sound effects directly on devices without relying on the cloud. This evolution, led by models like Stability AI’s “Stable Audio Open”, enhances responsiveness, privacy, and energy efficiency—key benefits for mobile apps and smart devices. As demand grows for running AI at scale, Arm’s high-performance, energy-efficient compute platform makes it possible to deploy generative models both in the cloud and at the edge, paving the way for a faster, more sustainable AI-powered future.
Gian Marco Iodice, Principal Software Engineer, explains how Arm’s KleidiAI performance libraries accelerate audio generation by up to 30x on Arm processors. This means developers can build custom audio tools that are both fast and efficient on mobile and edge platforms. By enabling high-performance generative AI to run locally, Arm is helping seamlessly integrate creative, intelligent audio experiences into everyday devices.
As AI demand surges across everything from datacenters to smartphones, Arm provides the essential foundation making it all possible. By working with innovators like Stability AI, Arm ensures that advanced models can operate seamlessly across diverse hardware environments.
How AI really works
This short, beginner-friendly video breaks down the core concept of neural networks—the foundation of AI models like ChatGPT and image recognition systems. It offers a clear, high-level explanation of how data flows through layers to help machines make decisions. Whether you’re new to AI or just want a better grasp of how it works, this is the perfect place to start.
LLVM 20 delivers new features and enhancements
The LLVM compiler infrastructure—a key open-source toolchain used across modern software development—has reached a major milestone with the release of LLVM 20. This update brings full support for Arm’s latest Armv9.6-A architecture, including advanced features like SVE2.1 and SME2.1 for high-performance vector and matrix processing.

Early benchmarks show significant gains across workloads like SPEC2017, with clear benefits for AI, HPC, and embedded applications. Volodymyr Turanskyy, Principal Software Engineer, talks about how these enhancements enable developers to write more efficient C/C++ code that directly targets leading-edge Arm hardware. By aligning LLVM updates with Arm architecture roadmaps, Arm ensures developers can fully tap into the latest performance and efficiency capabilities from day one. As compute demands grow, tools like LLVM 20 are essential in unlocking the full potential of Arm-based systems across industries.
Arm showcases innovation at Linaro Connect 2025
At Linaro Connect, developers and tech leaders convened to push the boundaries of open-source software on Arm-based platforms—technology increasingly central to the future of cloud, AI, and edge computing. The conversation centered on performance, energy efficiency, and scalability—three critical pillars as digital infrastructure demands grow. Notably, close to 50 percent of the compute shipped to top hyperscalers in 2025 will be Arm-based, while services like SAP HANA Cloud on AWS Graviton benefit from up to 30 percent improved performance, showcasing the practical, enterprise-level gains of Arm technology.
As Eric Sondhi, Senior Manager, Developer Marketing Strategy, highlights, the platform’s flexibility and ecosystem support make it a leading choice for those building the future of AI and cloud-native innovation. By enabling high-performance, low-power compute across a wide spectrum of devices, from cloud servers to edge AI endpoints, Arm provides the underlying architecture for next-gen workloads. Through open-source libraries like Arm NN and the Arm Compute Library—now deployed in over a billion devices—Arm empowers developers to build efficient machine learning applications at scale.
How AuthZed builds smarter authorization with Arm
As cloud-native development expands, ensuring secure, consistent access control from developer workstations to live environments becomes a top priority. AuthZed addresses this challenge with a modern authorization platform built for the cloud, enabling faster development cycles, reduced costs, and smoother collaboration across teams. This kind of infrastructure is critical in a world where scalable and secure software delivery is no longer optional but essential.
By leveraging Arm-based cloud infrastructure, AuthZed saw a 40% improvement in build times, a 20% drop in compute costs, and a 25% gain in CPU efficiency. These results highlight how Arm’s energy-efficient architecture and growing presence across major cloud providers are reshaping how developers think about performance and scalability. Arm is helping drive the future of secure, cost-effective cloud-native development by delivering the compute foundation for next-generation DevOps and platform engineering tools.
GitHub and Arm team up to revolutionize Windows developer experience
GitHub Actions runners for Windows on Arm mark a major step forward in continuous integration and delivery (CI/CD) for developers building AI and cloud-native applications. By enabling workflows to run natively on Arm-powered Windows devices, these runners reduce reliance on emulation, speed up testing cycles, and better reflect real-world deployment environments—making development faster, simpler, and more energy-efficient.
Pareena Verma, Solutions Director and Distinguished Engineer, explains how Arm’s collaboration with GitHub brings these capabilities to life, helping developers harness the full performance and efficiency of Arm-based Windows systems. With native CI/CD tools now available, Arm is supporting a more seamless, scalable development experience—accelerating innovation across edge, cloud, and AI ecosystems.
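Adopting the new runners is largely a one-line change in a workflow file. The sketch below is a hypothetical minimal workflow: the `windows-11-arm` label follows GitHub’s announced naming for its Arm-based Windows runners, and the build script is a placeholder, so confirm the labels available to your repository in GitHub’s documentation.

```yaml
# Minimal CI sketch: run a build natively on a Windows-on-Arm runner.
name: build-on-arm
on: [push]

jobs:
  build:
    runs-on: windows-11-arm        # native Arm64 Windows runner, no emulation
    steps:
      - uses: actions/checkout@v4
      - name: Build
        run: .\build.ps1           # hypothetical build script; substitute your own
```

Because the job runs natively on Arm64 Windows, test results reflect the deployment target directly instead of going through x64 emulation.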
Any re-use permitted for informational and non-commercial or personal use only.