The evolution of physical AI: From controlled environments to the real world
Physical AI is moving machines beyond predictable, controlled environments and into the complexity of the real world. Where robots were once designed for precision and repetition on factory floors, they are now being built to sense, interpret, reason, and respond to dynamic surroundings. This shift is also reflected at a macro level, with AI-driven productivity gains projected to lift global GDP by around 4% over the next decade.
Advances in AI are enabling physical AI systems to understand what they see, grasp context, and adjust their behavior within milliseconds. Whether navigating warehouses, assisting in hospitals, or driving on the road, autonomous machines make decisions based on real-time conditions rather than fixed sequences.
Arm has supported the development of physical AI systems for years, starting with fixed machines on the factory floor. That same foundation is now enabling the next generation of intelligent robots and autonomous machines that can operate in real-world environments and respond in real time.
Physical AI in the real world
The evolution of physical AI is clear when looking at the machines being built today. Across different form factors, robots and other autonomous machines are beginning to operate with greater awareness, adaptability, and independence.
Next-generation humanoid robotics
Advances in physical AI are being seen in more complex, human-like systems. At Arm’s Cambridge headquarters, AGIBOT highlighted how far humanoid robotics has progressed. Robots demonstrated dexterous control and navigated complex environments with fluid motion, combining perception, reasoning, and control in real time.
These physical AI systems are designed to operate in human environments. They must understand space, interpret intent, and execute actions with precision, while ensuring the safety of people around them. This places significant demands on compute, as multiple workloads such as vision processing, motion planning, and AI inference must run simultaneously within tight power and thermal limits.
The Arm compute platform supports these requirements by enabling efficient processing across these workloads, enabling humanoid systems to operate responsively and safely in real-world settings.
Quadrupeds and industrial robotics
Quadruped robots represent another important category of physical AI, particularly in environments where terrain is unpredictable and, at times, unsafe.
Robots developed by companies such as Deep Robotics are designed for inspection, exploration, and emergency rescue. They can navigate uneven ground, climb obstacles, and maintain stability in changing conditions. These capabilities rely on continuous perception and real-time control, supported by efficient compute.
Similarly, platforms like the PUDU D5 extend autonomous mobility into industrial environments. Designed for inspection, patrol, and logistics support, the D5 Series operates across large sites and uneven terrain using LiDAR and camera-based vision. This is particularly valuable in environments that are hazardous, remote, or difficult for people to access, where robots can support operations while improving safety and reducing risk.
To support this, the system uses a heterogeneous compute architecture that distributes workloads across perception, planning, and control. Sensor data is processed continuously, allowing the robot to interpret its environment and respond with low latency.
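The shape of such a pipeline can be sketched in a few lines. This is a deliberately minimal illustration, not the D5's actual software stack: the `perceive`, `plan`, and `control` functions and the obstacle threshold are all hypothetical stand-ins for real perception, planning, and control workloads, each of which could be mapped to a different processor in a heterogeneous system.

```python
def perceive(frame):
    # Perception stage: turn raw range readings (meters) into a world
    # model; here, anything closer than 1.0 m counts as an obstacle.
    return {"obstacles": [d for d in frame if d < 1.0]}

def plan(world):
    # Planning stage: pick a forward velocity based on the world model.
    return 0.0 if world["obstacles"] else 0.5

def control(velocity):
    # Control stage: convert the planned velocity into a motor command.
    return {"motor_cmd": velocity}

def pipeline(frames):
    """Run perception -> planning -> control over a stream of sensor frames.

    In a heterogeneous design each stage could run on dedicated hardware
    (e.g. an NPU for perception, a CPU core for planning, a microcontroller
    for control); they run sequentially here for clarity.
    """
    commands = []
    for frame in frames:
        world = perceive(frame)
        velocity = plan(world)
        commands.append(control(velocity))
    return commands

# First frame is clear; second contains a close obstacle.
print(pipeline([[2.0, 3.1], [0.4, 2.0]]))
# → [{'motor_cmd': 0.5}, {'motor_cmd': 0.0}]
```

The key design point is that each stage has a narrow, well-defined interface, which is what makes it possible to place stages on different processors without changing the overall loop.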
Processors built on the Arm compute platform support these core functions, working alongside AI accelerators to deliver efficient performance at the edge. This enables robots to operate independently in environments where reliability, safety, and energy efficiency are critical.
The same shift is also visible in industrial automation. Collaborative robots on factory floors are becoming more responsive to changing workflows, working alongside people and adapting to new tasks without requiring fully fixed configurations.
Boston Dynamics’ Spot is another example of how mobile robots are being deployed in industrial settings for inspection and remote operation, where real-time perception and control are essential.
Autonomous vehicles and mobility platforms
Physical AI is also transforming autonomous mobility, where systems must operate and navigate safely in complex, real-world conditions.
Autonomous robotaxis, such as those developed through the collaboration between Lenovo and WeRide, demonstrate how scalable compute platforms built on Arm are enabling Level 4 autonomy. These systems process large volumes of sensor data, including cameras and LiDAR, to make real-time driving decisions.
At the same time, the Arm and Tensor partnership highlights how next-generation compute platforms are being designed to support AI-driven mobility. These combine high-performance compute with energy efficiency, enabling real-time perception, planning, and control in autonomous systems.
Arm’s work with Rivian also shows how custom autonomy platforms are enabling vehicles to interpret their environment and make real-time driving decisions at scale. In these environments, reliability and latency are critical. Decisions must be made instantly, and systems must operate consistently over long periods. Efficient, scalable compute plays a central role in making this possible.
The building blocks of intelligent systems
At the core of physical AI is a continuous loop between sensing, decision-making, and action. Systems must process inputs from sensors, interpret that data, and trigger responses within milliseconds. In many cases, this latency between perception and action becomes a defining requirement, particularly in environments where safety and timing are critical.
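That sense-decide-act loop, with latency treated as an explicit budget, can be sketched as follows. All names, sensor values, and the 10 ms budget are illustrative assumptions, not drawn from any particular system.

```python
import time

LATENCY_BUDGET_S = 0.010  # illustrative 10 ms perception-to-action budget

def sense():
    # Stand-in for reading real sensors; returns distances in meters.
    return [1.8, 0.6, 2.4]

def decide(distances):
    # Stand-in for on-device inference: stop if anything is too close.
    return "stop" if min(distances) < 1.0 else "go"

def act(decision):
    # Stand-in for issuing an actuator command.
    return decision

def step():
    """One iteration of the sense -> decide -> act loop.

    Returns the action taken and whether it fit within the latency
    budget; a real system would treat a missed deadline as a safety
    event rather than a boolean.
    """
    start = time.monotonic()
    action = act(decide(sense()))
    elapsed = time.monotonic() - start
    return action, elapsed <= LATENCY_BUDGET_S

action, on_time = step()
```

Measuring the loop against a budget, rather than simply running it as fast as possible, is what turns "respond within milliseconds" from an aspiration into a checkable property.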
The evolution of physical AI is rooted in how modern robotic and autonomous systems are being engineered. They bring together several core capabilities that operate as a coordinated system. Each layer contributes to how the machine understands and interacts with the world.
Perception provides environmental awareness. Cameras, LiDAR, and sensor arrays generate a continuous stream of data, allowing the system to detect objects, estimate distance, and map its surroundings. Edge AI inference processes this data locally. By running AI models on-device, robots can respond instantly without waiting for cloud input. This is critical in environments where latency affects safety or performance.
Multimodal reasoning combines inputs such as vision and language. Robots and other autonomous machines can interpret a scene, understand a command, and decide on the appropriate action. This brings interaction closer to how humans communicate and operate. Real-time control, safety and security ensure that decisions are executed reliably. Deterministic compute allows systems to respond within predictable time frames, while safety and security mechanisms help manage risk in complex environments.
These capabilities are already being applied across industries. In smart factories, for instance, predictive maintenance systems analyze equipment data continuously to detect early signs of failure. Meanwhile, in physical AI deployments, systems are designed to process real-world inputs and act with minimal delay.
Enabling physical AI with efficient, scalable compute
Today’s physical AI systems must process high volumes of sensor data, run AI models, and control movement, all within tight power and thermal limits. Many operate on battery power, which places a strong emphasis on efficiency. At the same time, performance must remain consistent, especially in environments where timing and accuracy are critical.
To support this, today’s compute platforms must deliver:
- Deterministic performance for real-time decision-making;
- Energy efficiency for sustained operation in constrained environments;
- On-device AI inference to reduce latency;
- Heterogeneous compute to manage diverse workloads; and
- The ability to scale across different types of physical AI systems.
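The first of these requirements, deterministic performance, can be made concrete with a small sketch: a fixed-period loop that records how late each cycle actually fires. The 5 ms period and helper names are illustrative assumptions; a general-purpose OS only approximates this behavior, which is why deployed systems typically run the control loop on an RTOS or dedicated microcontroller that bounds the lateness.

```python
import time

PERIOD_S = 0.005  # illustrative 5 ms control period

def run_fixed_rate(task, cycles, period_s=PERIOD_S):
    """Run `task` at a fixed period, recording each cycle's lateness.

    The returned list of lateness values (seconds past each deadline)
    is the jitter a deterministic platform is expected to bound.
    """
    next_deadline = time.monotonic() + period_s
    lateness = []
    for _ in range(cycles):
        task()
        remaining = next_deadline - time.monotonic()
        if remaining > 0:
            time.sleep(remaining)
        lateness.append(time.monotonic() - next_deadline)
        next_deadline += period_s
    return lateness

# A trivial no-op task; real control logic would go in its place.
jitter = run_fixed_rate(lambda: None, cycles=20)
```

On a desktop OS the recorded jitter varies from run to run; on a deterministic platform it stays within a guaranteed bound, which is precisely what real-time decision-making depends on.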
The Arm compute platform is designed around these principles. Because of this, it is used across the full spectrum of compute within physical AI systems, from low-power microcontrollers that process sensor data to high-performance central compute (essentially the “brain” of a physical AI system) that handles complex AI workloads. This allows engineers and developers to build systems where each component is optimized for its role, while still operating on a consistent architecture.
For example, in robotics, this approach enables a balance between performance and efficiency. A robot can process sensor inputs, run AI inference efficiently, and execute control logic without exceeding power or thermal limits. This is essential for systems that need to operate continuously in real-world conditions.
Arm’s ecosystem also plays a role in accelerating development. By working with partners across hardware and software, and supported by a global ecosystem of over 22 million developers, Arm enables a wide range of physical AI platforms, from fixed machines on the factory floor to humanoids and quadrupeds operating in real-world environments.
Shaping the future of intelligent machines
Physical AI is shaping how machines interact with the world. Robots and other autonomous machines are being designed with a deeper understanding of their environment, supported by advanced AI that allows them to interpret, decide, and act. This is expanding their role across industries, from interactive systems to industrial automation and autonomous mobility.
As these systems continue to evolve, the balance between performance and efficiency will determine how widely they can be deployed. The foundation Arm built supporting fixed machines on the factory floor is now enabling the next generation of intelligent robots and autonomous machines, where AI operates directly in the physical world and systems respond in real time.
As physical AI continues to scale, it will increasingly be built on Arm.
Powering intelligent physical AI on Arm
Arm enables safe, real-time, energy-efficient AI systems that sense, decide, and act reliably in the physical world.
Any re-use permitted for informational and non-commercial or personal use only.






