How Arm is Driving the Next Wave of Robotic Innovation

Robots are no longer confined to factory floors or sci-fi movies—they’re part of everyday life. Maybe it’s your robotic vacuum quietly cleaning the living room, or the self-checkout kiosk guiding you through your grocery shopping. These aren’t novelties anymore; they’re signs of a broader shift.
Driven by advances in AI and machine learning, robots are evolving from rigid task-doers into adaptive, decision-making collaborators. They can perceive their environments, respond in real time, and even communicate using natural language. This growing intelligence is blurring the line between machine and assistant, opening new possibilities for human-robot collaboration.
As this technology matures, robots are poised to play a larger role across sectors—from co-bots on the factory floor to autonomous systems in healthcare, logistics, and beyond. AI is not just making robots smarter; it is making them more human-compatible, enabling them to perceive, decide, and act in ways that enhance productivity and touch nearly every aspect of our lives.
Arm at the center of the physical AI evolution
The world is entering a new phase of AI innovation—one where intelligence doesn’t just reside in the cloud but is embedded in machines that perceive, decide, and act in the physical world. This is the era of agentic AI, where systems don’t wait for commands but take proactive steps, adapt in real time, and collaborate with other machines. It’s also the rise of physical AI—where that same intelligence is integrated into devices that can navigate, sense, and manipulate the environment.
At the heart of this shift is Arm. As the leading provider of energy-efficient, scalable compute, we’re enabling the fusion of AI and robotics at the edge, where decisions must be made instantly and power budgets are tight. Whether it’s a robot patrolling a train station, a smart door lock recognizing a face, or a humanoid navigating debris, our compute fabric is the common denominator bringing agentic intelligence to life.
R2C2’s smarter fleets and multi-robot autonomy
R2C2’s AI platform is like Android for robots—an interoperable OS for heterogeneous fleets. In Hong Kong, R2C2-powered robot dogs inspect train cars autonomously, analyzing thousands of visual data points and delivering over 99% inspection accuracy. The system runs on NVIDIA Jetson modules, where Arm CPUs handle low-latency operations such as control and inspection logic while heavier AI workloads are offloaded to the onboard GPU.
R2C2 is solving some of robotics’ hardest problems: interoperability between different robot types, lack of usable application software, and fleet scalability. With Arm at its core, it’s creating plug-and-play systems that thrive in environments where connectivity, power, and latency constraints would otherwise cripple performance.
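The CPU/GPU split described above can be sketched in a few lines. This is an illustrative pattern only, not R2C2 code: the function names (`read_sensors`, `apply_control`, `run_inference`) are hypothetical stand-ins, and a thread pool simulates the accelerator so the control loop never blocks on inference.

```python
# Illustrative sketch: keep low-latency control running on the CPU every
# tick while heavier inference is offloaded and collected asynchronously.
import time
from concurrent.futures import ThreadPoolExecutor

def read_sensors():
    """Stand-in for fast sensor reads (IMU, encoders)."""
    return {"tilt": 0.01}

def apply_control(state):
    """Stand-in for the real-time control step on the CPU."""
    return -0.5 * state["tilt"]  # simple proportional correction

def run_inference(frame):
    """Stand-in for a heavy vision model offloaded to the GPU."""
    time.sleep(0.05)  # simulate ~50 ms of accelerator latency
    return {"defect_found": False, "frame": frame}

executor = ThreadPoolExecutor(max_workers=1)
pending = executor.submit(run_inference, 0)

commands = []
for tick in range(10):                  # 10 control ticks at ~100 Hz
    commands.append(apply_control(read_sensors()))
    if pending.done():                  # collect results without blocking
        result = pending.result()
        pending = executor.submit(run_inference, tick)
    time.sleep(0.01)

executor.shutdown(wait=True)
print(len(commands))
```

The key property is that the control loop produces a command on every tick, whether or not the slower inference result has arrived.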
Motion meets intelligence with Deep Robotics
Deep Robotics’ quadrupeds are redefining mobile agility in hazardous terrains—from industrial tunnels to post-disaster sites. Built on Arm-powered Rockchip RK3588 SoCs, these robots deliver high-performance motion control while consuming just over 10 watts—one-third the power of legacy x86 systems.
This leap in efficiency enables smaller batteries, simpler cooling, and longer runtime. Whether patrolling a power station or advancing toward a humanoid platform, every step taken by Deep Robotics is guided by Arm CPUs—offering the precise, scalable compute needed for embodied intelligence in real-world conditions.
Intelligence at the edge with Beken’s AI SoCs
Beken’s BK7259 chip brings ultra-fast, ultra-efficient AI to everyday devices—from toys to smart locks. Powered by Arm Cortex-M cores and the Ethos-U65 NPU, the chip delivers local facial recognition in under 200 milliseconds, all while drawing minimal power—ideal for battery-operated edge AI.
Beken’s integration of Arm’s AI toolchain, including CMSIS-NN and Vela, accelerates deployment and development. With real-time audio processing, robust security features, and seamless Wi-Fi connectivity, Beken’s AI platforms prove that powerful intelligence doesn’t have to come at the expense of energy efficiency or cost.
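For a battery-powered device like a smart lock, the sub-200 ms figure above is best thought of as an end-to-end latency budget: wake on an event, run recognition locally, then return to sleep. The sketch below is a hypothetical illustration of that pattern, not Beken firmware; `recognize_face` stands in for an NPU-accelerated model.

```python
# Hedged sketch: event-driven recognition against a fixed latency budget,
# the pattern a duty-cycled, battery-powered edge-AI device follows.
import time

LATENCY_BUDGET_S = 0.2   # 200 ms end-to-end target cited in the article

def recognize_face(frame):
    """Hypothetical stand-in for on-device, NPU-accelerated inference."""
    time.sleep(0.02)     # simulate a fast local model
    return frame == "enrolled_user"

def handle_wake_event(frame):
    """Run recognition once and report whether it met the budget."""
    start = time.monotonic()
    unlocked = recognize_face(frame)
    elapsed = time.monotonic() - start
    return unlocked, elapsed <= LATENCY_BUDGET_S

unlocked, within_budget = handle_wake_event("enrolled_user")
print(unlocked, within_budget)
```

Because everything runs locally, the budget holds even without connectivity, and the radio only needs to wake for notifications.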
Affordable agility with UCR’s humanoid robotics
Under Control Robotics (UCR) is taking a different approach—building rugged, affordable humanoid robots for dangerous work in construction, mining, and energy. Their prototype, “Moby,” walks, balances, and adapts using a lightweight AI model trained in simulation and deployed on an Arm-powered Jetson Orin module.
The system leverages Arm Cortex-A78-class CPUs for real-time motion logic and STM32 controllers for motor actuation. With this setup, UCR achieves performance previously reserved for high-cost systems, while using inexpensive sensors and minimal power. It’s a prime example of how Arm enables edge-native, AI-driven robotics for harsh environments—without the bulk or power demands of traditional architectures.
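The two-tier split described above—policy logic on the application CPU, motor actuation on microcontrollers—can be sketched as a compute step followed by a compact wire format. Everything here is an illustrative invention, not UCR's protocol: `policy` stands in for a sim-trained controller, and the packet layout is a made-up example.

```python
# Hedged sketch: a lightweight policy computes per-joint targets on the
# application CPU, and the targets are framed into a compact binary
# packet of the kind a microcontroller motor driver might consume.
import struct

def policy(observation):
    """Stand-in for a sim-trained balance policy: returns joint targets."""
    return [0.1 * o for o in observation]

def frame_setpoints(targets):
    """Pack a 1-byte joint count followed by little-endian float32 targets."""
    return struct.pack("<B%df" % len(targets), len(targets), *targets)

obs = [0.5, -0.2, 0.0, 0.3]            # e.g. joint-angle errors
packet = frame_setpoints(policy(obs))
print(len(packet))                      # 1 header byte + 4 * 4-byte floats = 17
```

Keeping the microcontroller interface to fixed-size setpoint frames is what lets the heavy lifting stay on the CPU while the STM32 side remains simple and deterministic.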
The future with Arm as the physical AI backbone
As we transition into the physical AI era, robotics is no longer just a research field; it is a platform for scalable, real-world deployment. What unites these diverse innovators—from industrial quadrupeds to AI toys and humanoid laborers—is their reliance on Arm’s heterogeneous compute. By combining power efficiency, flexibility, and scalable AI performance, Arm provides the foundational compute layer making agentic and physical AI not only possible, but practical.
The future of robotics is autonomous, adaptive, and embedded. And wherever that future takes us—into smart cities, hazardous sites, or our everyday lives—Arm will be at the center, powering intelligence in motion.
Any re-use permitted for informational and non-commercial or personal use only.