What GDC Festival of Gaming 2026 revealed about mobile gaming’s future
If you spent time at GDC Festival of Gaming 2026, you probably noticed a familiar theme running through many of the conversations around mobile graphics: How far can visual ambition really go on mobile without pushing devices past their limits?
At Arm’s developer summit at GDC, that question was tackled repeatedly in discussions around neural graphics, performance tuning, and the practical realities of building on mobile GPUs. Across a variety of sessions from Arm experts and leading gaming partners such as Enduring Games, Epic Games, Infold Games, and Sumo Digital, the focus was on the innovative techniques and optimizations that developers can apply today inside production workflows.
The goal is to deliver games with enhanced visual quality that run smoothly and efficiently on mobile devices.

Moving neural graphics from experimentation to new workflows
A clear theme from Arm’s developer summit was how to move neural graphics from experimentation into practical workflows.
One example was the announcement that Sumo Digital is collaborating with Arm to advance PC-class gaming on mobile using Arm neural technology. This collaboration helps accelerate the shift of Arm neural technology – which adds dedicated neural accelerators to future Arm GPUs – from research to production, giving developers an early opportunity to experiment with AI-driven, high-fidelity graphics techniques on mobile.
As Chris Bergey, EVP, Edge AI Business Unit, Arm, said: “With Sumo Digital, we’re doing important work to ease the on-ramp for developers so they can push visual quality even further on mobile.”
Since announcing our neural technology, Arm has focused on turning neural techniques from experimental ideas into practical capabilities. At Arm’s developer summit, we introduced Neural Frame Rate Upscaling (NFRU), a neural graphics technique that generates frames to improve smoothness in mobile games. Through the Early Access Program, developers can begin using NFRU within the Neural Graphics Development Kit ahead of its general availability.
Start building with Arm neural graphics technology through the Early Access Program
In Arm’s developer summit session – “Co-creating possibility: The future of mobile gaming” – Sumo Digital demonstrated how neural techniques can be applied today in a production-quality reference game environment and then brought into a live Unreal Engine workflow. The value of this work lies not just in what’s on the screen, but in the technical implementation underneath it, including:
- How neural frame generation interacts with motion vectors and optical flow (see the sketch after this list);
- How it coexists with ray-traced lighting; and
- How those systems can be structured to remain power-efficient on mobile.
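To make the first of these points concrete, here is a minimal, engine-agnostic sketch of the core idea behind motion-vector-guided frame generation: warping a rendered frame along per-pixel motion vectors to synthesize an intermediate frame. This is an illustrative toy in NumPy – not Sumo Digital’s implementation and not the NFRU algorithm – and the array shapes and half-step scaling are assumptions.

```python
import numpy as np

def generate_intermediate_frame(frame: np.ndarray,
                                motion_vectors: np.ndarray,
                                t: float = 0.5) -> np.ndarray:
    """Warp `frame` (H, W, 3) along per-pixel `motion_vectors` (H, W, 2)
    to approximate an in-between frame at interpolation factor `t` in [0, 1].

    A toy backward-warping sketch: real frame generation also handles
    disocclusions, depth, and optical-flow refinement of bad vectors.
    """
    h, w, _ = frame.shape
    ys, xs = np.mgrid[0:h, 0:w]

    # Step each pixel a fraction `t` of the way back along its motion
    # vector, then clamp to the image bounds (no disocclusion handling).
    src_x = np.clip((xs - t * motion_vectors[..., 0]).round().astype(int), 0, w - 1)
    src_y = np.clip((ys - t * motion_vectors[..., 1]).round().astype(int), 0, h - 1)

    return frame[src_y, src_x]
```

In a production pipeline, this warping would run on the GPU, with optical-flow correction covering pixels whose engine motion vectors are unreliable – transparencies and shadows, for example – which is exactly the interaction the session explored.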
In another session, “Neural Graphics in practice: Arm’s SDK for next-gen game development,” Arm presented its neural technology stack as a developer-ready pathway for bringing neural rendering techniques into real-world mobile game development. The session highlighted various features in the Neural Graphics Development Kit – including Arm’s ML extension for Vulkan, the Neural Graphics SDK for game engines, and ready-to-use Unreal Engine plugins – that act as a practical route for integrating neural techniques into shipping workflows without needing a bespoke research stack.

Two other very well-attended sessions at Arm’s developer summit reinforced similar themes around graphics innovation and early experimentation.
A popular talk from Infold Games showed how the studio deployed advanced real-time global illumination in Love and Deepspace. It explained how techniques such as surfel-based lighting and radiance cascades bring richer lighting and visual depth to mobile games while still working within mobile performance constraints.
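For readers unfamiliar with radiance cascades, the appeal is a scaling trade-off: each successive cascade halves the spatial density of probes while increasing their angular resolution, so distant lighting is captured by fewer, more directional probes. The toy calculation below illustrates that trade-off with assumed parameters (a 2D scene and a branching factor of 4); it is not Infold Games’ implementation.

```python
# Toy illustration of the probe-count / angular-resolution trade-off
# behind radiance cascades (2D case, assumed base parameters).
BASE_SPACING = 1.0   # probe spacing for cascade 0, in world units
BASE_DIRS = 4        # ray directions per probe at cascade 0
SCENE_SIZE = 256.0   # square scene edge length

for cascade in range(5):
    spacing = BASE_SPACING * 2 ** cascade   # probes get sparser...
    dirs = BASE_DIRS * 4 ** cascade         # ...but more directional
    probes = (SCENE_SIZE / spacing) ** 2    # probes covering the scene
    rays = probes * dirs                    # total rays in this cascade
    print(f"cascade {cascade}: {int(probes):>6} probes x {dirs:>5} dirs "
          f"= {int(rays)} rays")
```

Running this shows the total ray count per cascade staying constant: that near-flat cost per added cascade is what makes the technique plausible within mobile performance budgets.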
Meanwhile, Enduring Games focused on the benefits of gaining early access to emerging technologies. The talk highlighted how experimenting with new tools and capabilities earlier in development can help studios refine their pipelines and stay ahead as graphics and gameplay expectations continue to evolve.
Start building with the world’s first open Neural Graphics Development Kit
Integrating AI into developer pipelines
Another key theme to emerge from Arm’s developer summit was how AI is being integrated directly into development workflows to enable more interactive mobile gaming experiences.
That idea was explored during a session led by Arm’s Kieran Hejmadi, which attracted a full house. The talk walked through a complex machine learning (ML) pipeline integrated into the Godot engine, built entirely with open-source models and community-maintained plugins. The session showed how large language models (LLMs) can power more dynamic non-player character (NPC) interactions without relying on proprietary infrastructure.

What made the session particularly compelling was the accessibility of the workflow. For example, a team of students successfully assembled the system and deployed it to create novel gameplay scenarios driven by LLM-powered dialogue and decision-making.
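As a rough illustration of what the model-serving side of such a setup can look like, the sketch below generates NPC dialogue from a small open-weight chat model using the Hugging Face transformers library. The model name, persona, and prompt format are placeholders for illustration, not the components used in the session.

```python
# Minimal sketch of an LLM-backed NPC dialogue call using an open-weight
# chat model; not the actual pipeline shown in the session.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",  # placeholder open model
)

def npc_reply(persona: str, player_line: str) -> str:
    """Generate one in-character NPC response to the player's line."""
    messages = [
        {"role": "system", "content": f"You are {persona}. Stay in character "
                                      "and answer in one or two sentences."},
        {"role": "user", "content": player_line},
    ]
    out = generator(messages, max_new_tokens=80, do_sample=True)
    # The pipeline returns the conversation with the assistant turn appended.
    return out[0]["generated_text"][-1]["content"]

print(npc_reply("a wary blacksmith in a mountain village",
                "Have you seen anyone pass through here tonight?"))
```

In a Godot integration like the one described, the game client would typically call a function like this through a plugin or a local endpoint, keeping the model outside the game process.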
The key takeaway is that AI-driven gameplay systems are increasingly accessible and easier to experiment with and integrate within existing development pipelines.
Foundational optimizations still deliver some of the biggest wins
One of the key takeaways from Arm’s developer summit was that mobile performance challenges are often easy to miss until they show up on the device.
As device capabilities improve, game studios naturally aim to design for newer specifications with richer effects, denser scenes, and greater on-screen activity. But visual expectations are rising faster than efficiency gains, and on mobile, power and thermal constraints still define what’s possible.
This matters because hitting thermal limits leads to dropped frames, throttling, and battery drain – all visible to players, and all directly impacting their gaming experience.
This is where Arm’s Performance Studio makes a difference. As Arm Developer Evangelist John French explained in his session, “Performance tuning and upscaling for mobile development in Unity,” Performance Studio is a comprehensive suite of tools for profiling Arm-based hardware. Featuring Frame Advisor, Streamline, RenderDoc, and Mali Offline Compiler, the suite provides deep visibility into GPU behavior, making it easier to identify performance issues, isolate bottlenecks, and apply targeted optimizations confidently. As mobile rendering pipelines become more complex, that kind of visibility is essential.
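As one example of building that visibility into a pipeline, the sketch below batch-runs the Mali Offline Compiler (its command-line binary is `malioc`) over a directory of shaders so static cost reports land in CI logs rather than surfacing on device. The directory layout and failure policy are assumptions for illustration.

```python
# Sketch: batch-run the Mali Offline Compiler (malioc) over project
# shaders so its static analysis shows up in CI. Paths are assumed.
import pathlib
import subprocess
import sys

SHADER_DIR = pathlib.Path("shaders")  # hypothetical project layout

failures = 0
for shader in sorted(SHADER_DIR.glob("*.frag")) + sorted(SHADER_DIR.glob("*.vert")):
    result = subprocess.run(["malioc", str(shader)],
                            capture_output=True, text=True)
    print(f"--- {shader} ---\n{result.stdout}")
    if result.returncode != 0:
        print(result.stderr, file=sys.stderr)
        failures += 1

sys.exit(1 if failures else 0)
```

Checks like this complement the runtime views from Frame Advisor and Streamline: the offline compiler flags expensive shaders before they ship, while the profilers show how the whole frame behaves on real hardware.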
Download Arm Performance Studio
Start experimenting with neural graphics today
The conversations at Arm’s developer summit at GDC 2026 made one thing clear: the next phase of mobile gaming will be defined by the smarter use of compute across hardware and development workflows.
Neural graphics techniques are moving from research into real workflows, AI is becoming easier to integrate directly into gameplay systems, and developers now have more powerful tools to understand and optimize how their games run on mobile hardware.
Together, these developments are helping game studios push visual ambition further, while maintaining the efficiency and stability that mobile devices demand. For developers, that means creating richer worlds, more responsive characters, and smoother experiences without compromising performance or battery life.
Now is the time for developers to start experimenting with neural graphics through the Neural Graphics Development Kit and Early Access Program, laying the groundwork to integrate advanced AI-driven upscaling and frame-generation techniques into game development pipelines.
Explore the early access program
Start building with Arm neural graphics technology before anyone else