Arm Newsroom Blog

How physical AI and edge AI defined CES 2026 in Las Vegas

Key takeaways from the CES 2026 show floor, from real-world physical AI to edge-first computing.
By Arm Editorial Team

The Consumer Electronics Show (CES) is one of the biggest technology events of the year. Every January, Las Vegas becomes the place where the technology industry shows what it’s building next, from headline keynotes to hands-on demos of the latest products.

What stood out at CES 2026 was just how tangible and real AI had become. Across the show floor, AI was no longer limited to generating content or being present in the cloud. Instead, it appeared inside robots navigating physical spaces, vehicles making decisions locally, PCs running AI workloads on-device, and wearables interpreting sensor data in real-time.

Another element that set this CES apart from previous years was how consistently physical AI and edge AI showed up across categories and booths. Across automotive, robotics, PCs, XR wearables, and smart home devices, the same underlying requirement emerged: AI needs to run locally, efficiently, and reliably.

This is why many of the technologies that attendees were noticing, touching and experiencing at CES 2026 were built on the Arm compute platform.

Here are the key takeaways from CES 2026.

Physical AI took center stage at CES 2026

Walking the CES 2026 show floor, physical AI was impossible to miss and marked a clear shift in how AI is being built and deployed.

Early in the show, NVIDIA’s keynote captured this moment by describing the rise and importance of physical AI. AI is no longer positioned as something that lives primarily in the cloud; it now appears inside machines that interact directly with the real world: vehicles navigating complex environments, robots performing physical tasks, and autonomous systems designed to operate safely and independently.

The rise of robotics

Robotics was a major pillar of physical AI at CES 2026. Hyundai and Boston Dynamics unveiled the latest version of Atlas, while companies such as Robotera, Noetix, PUDU Robotics, and AGIBOT introduced humanoid and service robots aimed at logistics, manufacturing, and human-centric spaces. The vast majority of these robotics platforms rely on Arm technology to handle perception, motion planning, and control locally, where latency, power efficiency, and reliability are critical.

Beyond robotics, BMW announced that its 2026 iX3 voice assistant will be powered by Alexa+, one example of how AI is moving deeper into the in-car experience, running locally to deliver faster and more natural interactions.

The automotive and autonomous evolution

Automotive was one of the most visible examples of the physical AI shift. AI-defined vehicles emerged across mobility announcements, with intelligence increasingly running directly inside the car. NVIDIA’s CES 2026 keynote reinforced this shift, showing AI-defined capabilities in the Mercedes-Benz CLA, which runs on the Arm compute platform to support real-time perception, decision-making, and control.

The same AI-defined approach extended into heavy equipment, where Caterpillar and NVIDIA showcased at CES 2026 how Arm-powered platforms are enabling real-time perception and autonomy in construction and industrial machines. The collaboration also highlighted how physical AI is moving beyond vehicles into broader industrial mobility.

Meanwhile, Nuro, Lucid Motors, and Uber unveiled new self-driving Lucid Gravity SUVs, with this collaboration featuring Arm technology from cloud to car. Nuro Driver enables real-time AI at the edge on the Arm-based NVIDIA Drive AGX Thor, with the Lucid Gravity SUVs using the same Thor platform for safety-critical autonomy, and Uber’s fleet running on Arm Neoverse-based cloud infrastructure at scale.

The shift from pilots to production was especially clear in autonomous mobility, with Lenovo showcasing its L4 Autonomous Driving Domain Controller AD1 built on dual NVIDIA DRIVE AGX Thor systems, which is now being deployed by WeRide in its GXR robotaxi, the world’s first mass-produced L4 autonomous vehicle on this platform.

Meanwhile, Tensor Auto’s robocar showed how modern vehicles increasingly rely on distributed compute architectures, with more than 100 Arm-based processing cores deployed across the vehicle through products from partners such as NVIDIA, Renesas, NXP, and Texas Instruments.

During its CES 2026 keynote, Siemens outlined how AI is increasingly being embedded across digital twins, factory automation, and industrial control systems to bridge the physical and digital worlds. Siemens President and CEO Roland Busch referenced the company’s new PAVE360 Automotive, a cloud-based digital twin powered by Arm Zena Compute Subsystems (CSS) to accelerate the development process. It’s clear that industrial and engineering workflows are a critical part of the physical AI shift.

As physical AI expands from perception and control into user-facing experiences, the need for efficient, scalable compute platforms only grows. That is why the Arm compute platform continues to underpin so many of the physical AI systems showcased at CES 2026.

As Drew Henry, EVP of Physical AI at Arm, says, “This era of physical AI, which we are at the very beginning of, is enabled by Arm because we are the unique company that is able to provide the computing technologies that enable safe and highly efficient systems which are built on top of an incredibly mature software ecosystem. So, when you look across CES, you can see numerous examples of devices, machines, robots, and cars that are powered by Arm.”

Edge AI became the default for consumer devices

CES 2026 showed that edge AI is well and truly visible in the mainstream. Across PCs, smartphones, wearables, and smart home devices, AI is being designed to run directly on the device to enable faster responses, offline operation, and lower power use.

On-device AI moves into mainstream in PCs and smartphones

Personal computing was one of the clearest indicators of the shift toward edge AI at CES 2026. That direction was most visible in systems powered by Qualcomm’s Snapdragon X-series platforms, which appeared across multiple OEM laptops and showed how new AI PCs treat local inference as a core capability. Many Windows on Arm systems were showcased running sustained on-device AI workloads while delivering multi-day battery life, a combination that is difficult to achieve without architectures designed for efficiency at scale.

Meanwhile, Chromebooks also reflected this shift toward edge-first AI. Several Arm-based Chromebook designs showcased at CES 2026 – including the Acer Chromebook 315 and ASUS Chromebook CM32 – emphasized always-on responsiveness, long battery life, and local AI features that support everyday tasks.

Lenovo further reinforced this pattern with Qira, its new personal AI assistant designed to work across PCs and smartphones. Instead of treating AI as a cloud-only service, Qira was presented as a hybrid system that runs key tasks locally on devices, reducing latency while preserving personalization across endpoints. That cross-device approach reflects a broader industry reliance on the Arm compute platform.

Smartphones followed the same trajectory. Samsung’s Galaxy Z TriFold received the “Best of CES 2026 Award” for its tri-fold design, which expands from a smartphone into a tablet-sized display. The new smartphone is built on the Arm-based Snapdragon 8 Elite.

Wearables and smart home devices bring edge AI into everyday life

Wearables and smart home devices at CES 2026 showed how edge AI is becoming ambient and always on.

Wearables are becoming more personal and always on, delivering AI insights through devices that are small, comfortable, and worn throughout the day. This presents a significant technical challenge, one that depends on ultra-low-power compute such as Arm’s. Ultra-low-power edge AI solutions from companies such as Ambiq and Alif Semiconductor, highlighted during CES 2026, showed how Arm-powered platforms are enabling always-on intelligence in wearables without compromising battery life.

The latest XR wearable devices also featured prominently at CES 2026, particularly in the form of lightweight AR smart glasses designed for everyday use. Companies such as XREAL and ThinkAR showed how on-device AI supports spatial awareness, gesture input, and contextual overlays directly on smart glasses. These demos signaled a shift toward edge-first XR, where responsiveness, comfort, and battery efficiency depend on local AI processing built on Arm’s power-efficient compute platform.

Meanwhile, in the smart home space, connected devices and displays leaned into on-device intelligence. Smart TVs from Samsung, LG, and Hisense, all showcased at CES 2026, continue to rely on Arm-based systems-on-chip (SoCs) to support voice interaction, personalization, and real-time content processing locally. Other consumer edge devices, from smart locks to pet and home automation products recognized in the “Best of CES 2026” award categories, further illustrated how AI is moving closer to the user.

Taken together, the edge AI announcements during the event highlighted that as AI becomes more personal, pervasive, and embedded in everyday products, running intelligence at the edge is essential for consumer devices. Across PCs, smartphones, wearables, and smart home devices, Arm-based platforms provide the efficiency, scalability, and ecosystem support needed to deliver AI experiences at global scale.

Chris Bergey, EVP of Edge AI at Arm, says, “Edge AI is going to enable a whole new class of experiences and devices that we are just starting to understand. Many of these examples are around XR and what we can do now with smart glasses and smart neural sensors that help control those sensors. But the one thing you can be sure about is that Arm will be at the center of these future AI platforms.”

The foundation beneath the next wave of AI

CES 2026 showed that AI is increasingly becoming part of the real-world environment. Intelligence is moving into devices, vehicles, and machines that must operate in real-time and at a global scale. As this shift accelerates, success will hinge not only on raw performance, but on platforms capable of delivering energy efficiency, reliability, and long-term support across a wide range of product categories.

Arm delivers both unmatched energy efficiency and the world’s largest software developer base, making it the natural platform for building and scaling physical and edge AI systems globally. This explains why Arm underpinned so many of the technologies showcased at CES 2026 that are forming the foundation for the next wave of AI innovation.
