Top 5 trends you can expect from CES 2026 in Las Vegas
The year 2025 marked a turning point for artificial intelligence (AI). What once felt experimental is now appearing in cars, consumer electronics, intelligent home systems and the next generation of robotics. CES 2026 arrives at a pivotal moment, when intelligent systems are becoming smarter, faster and more integrated into our daily lives.
At this year’s CES, two forces will define AI:
- Physical AI, where vehicles, robots and machines see the world, understand it and safely act in real environments.
- Edge AI, where intelligence moves closer to the user, powering responsive, private and personalized experiences on everyday devices.
In some cases, these forces are combining. Extended reality (XR) tools are training grounds for physical AI. Robots learn from virtual twins. Wearables anticipate your needs. Devices adapt on the fly.
Across it all, the Arm compute platform does the heavy lifting, powering the intelligence that helps machines perceive, reason and act. As we gear up for CES 2026 in Las Vegas, here’s your guide to the top AI and compute innovations to watch for.
Autonomous experiences take a decisive step forward
Automakers are increasingly moving from software-defined vehicles (SDVs) to AI-defined platforms, where real-time perception, prediction and split-second decision-making are treated as foundational capabilities. The pressure comes from drivers who want more situationally aware assistance, and smart cities looking for safer, more efficient mobility.
You’ll see this in the autonomy stacks now coming to market. For instance, Rivian’s in-house autonomy platform, featuring a custom chip built on the Arm compute platform, is designed to unlock a wide range of advanced autonomy capabilities across future vehicles and other autonomous systems.
Tesla’s next-generation AI5 chip, built on the Arm compute platform, is delivering up to 40x faster AI performance than the prior generation, underlining how efficiency and scalability are becoming central requirements as physical AI expands beyond traditional driver-assistance. Elsewhere, the Arm-based NVIDIA DRIVE Thor platform is powering the Lenovo HPC 3.0 system behind WeRide’s Level 4 Robotaxi GXR, while autonomy companies such as Nuro, Wayve, and Zoox continue to refine their services within defined operating zones.
And make sure to stop by the NVIDIA DRIVE AGX Thor Developer Kit showcase, where you’ll see how Arm-based platforms enable a seamless path from early development to scalable, production-ready automotive systems. Meanwhile, digital twin demos and virtual platforms, including Siemens’ new PAVE360™ Automotive technology, used by multiple companies exhibiting at CES 2026, are speeding the path to integration. That broader push is reflected in Arm’s work with Amazon Web Services (AWS) and Google Cloud.
Additionally, HERE Technologies will highlight how Arm-based AWS Graviton infrastructure is helping move workloads from the cloud to production more efficiently.
Robotics embodies a new phase of physical AI
CES 2026 is where you’ll see robots move beyond the lab, driven by advancements in AI models, sensing, actuation, low-power compute and energy-efficient system integration that make autonomous deployment more practical at scale.
For instance, look out for DEEP Robotics’ LYNX M20 Pro, a quadruped designed for industrial inspection and emergency operations. It can traverse rugged terrain that would challenge conventional wheeled robots. Service robots are becoming a familiar sight as well, with companies like Roborock and PUDU Robotics demonstrating cleaning, delivery and hospitality robots capable of navigating busy, unpredictable indoor spaces.
Humanoids are also moving beyond the conceptual stage. Several companies, including Agility Robotics, AGIBOT and Galbot, have made tangible progress in locomotion, balance and manipulation, with thousands of units already produced and deployed in commercial environments. Their presence at CES 2026 will help clarify how these use cases are beginning to take shape, particularly across manufacturing, retail, and warehouse operations.
Platforms such as NVIDIA’s Jetson Thor, running on Arm-based compute, show how simulation is evolving into deployment-ready systems.
On-device AI becomes standard on PCs, laptops, and tablets
Through the halls of CES 2026, you’ll notice how intelligence is increasingly moving onto the consumer devices people use daily, with the Arm compute platform making it fast, efficient and always-on. Windows on Arm is a good indicator of that trend, with over 100 models planned for 2026 across every major OEM. This, coupled with expanding native app support, demonstrates how Windows on Arm has moved into the mainstream of the AI PC market.
Meanwhile, Apple’s M-series MacBooks and Google Chromebooks have proved that strong performance and long battery life can co-exist, a balance that becomes even more important as on-device AI becomes the standard, handling tasks like translation, image enhancement and meeting summaries without relying on cloud connectivity.
If you’re a tablet fan, check out devices like Xiaomi’s Pad 7 Ultra, built on the Arm compute platform. It shows how high-performance mobile silicon is scaling into tablets while delivering exceptional responsiveness and long battery life.
A similar pattern is emerging in new AI workstations. The NVIDIA DGX Spark, introduced at CES 2025 and already in the hands of developers, returns to CES 2026 as it powers a variety of new AI workstations from leading OEMs. The compact machine uses a GB10 Superchip with 20 Arm cores to run large models locally. Demos at the Arm CES 2026 area will focus on three workloads:
- Local inference of 120-billion-parameter language models;
- Local image generation through ComfyUI workflows; and
- Workloads that previously required cloud instances running instead on a silent, secure, USB-C-powered desktop.
Smarter wearables and everyday devices
AI at the edge is becoming most visible in the devices people wear and use every day, with wearables at CES 2026 showing how intelligence is becoming more personal and portable. For example, Meta’s latest Ray-Ban smart glasses will show how AI can fit into daily routines without demanding attention, offering hands-free capture, voice interaction and discreet visual cues. Paired with a wrist-worn Neural Band control device built on Arm CPUs and Ethos-U55 NPUs, the system runs continuous spatial AI and on-device inference while staying within strict power and thermal limits.
Meanwhile, Arm enables a variety of XR reference designs that blend spatial audio, computer vision and gesture input inside compact, battery-powered wearables.
Health wearables are evolving from step counters into daily companions that can interpret patterns and offer timely guidance. The Oura Ring Gen 4, built on Arm-based silicon, is a small device that analyzes users’ sleep, stress and recovery throughout the day, while lasting several nights on a single charge.
What’s common across all these wearable devices is that they need to run real-time intelligence within limited power and thermal budgets. Many already rely on Arm’s low-power compute to handle continuous sensing and on-device inference while keeping data local.
Smarter homes and connected things
In 2026, smart-home systems will get smarter and more local. CES 2026 will show how connected systems such as lights, cameras and thermostats are evolving to meet new demands for energy, privacy and reliability. The next phase will be about making those devices work together in more useful ways, with three forces at play:
- Higher energy costs and climate goals that are encouraging more control over heating, cooling and appliance usage;
- Privacy concerns that are making people wary of sending every video frame and voice clip to the cloud; and
- Consumer reliability expectations as families rely on connected devices for security and comfort.
Most of the intelligence is moving onto local devices and hubs. Smart displays and hubs, such as Google Nest systems, are increasingly handling AI tasks like presence detection, local voice control and automation flows on the device itself. Smart TVs from Samsung, LG, TCL and Hisense are also acting as home control surfaces, integrating media with the management of devices that support standards such as Matter.
Most sensors, cameras, thermostats and home hubs rely on Arm-based microcontrollers or application processors, which offer the mix these devices need: low power consumption, reliable performance and a mature software ecosystem. Meanwhile, larger home devices, like today’s smart TVs and displays, are built on the Arm compute platform to manage video and connectivity tasks while processing a growing share of AI workloads directly on the device.
Finally, at CES 2026, new Arm-powered endpoint platforms such as Alif’s Ensemble E8, which we will showcase at the event, will hint at how ambient AI workloads will soon run on ultra-low-power nodes at the very edge.
CES 2026: Arm powers the future of intelligence everywhere
By the time the Las Vegas halls empty out, the lasting impression left by CES 2026 will be the shift in how advanced intelligence can travel with people, adapt to their routines, and shape how their devices sense, decide and act. While AI will show up differently in a robotaxi than in a laptop or a health wearable, the common thread will be how the technology enables extraordinary experiences reliably, efficiently and exactly where it’s needed.
CES 2026 will reveal a future where intelligence is seamlessly embedded across the entire technology landscape – a future that is built on Arm.
Any re-use permitted for informational and non-commercial or personal use only.