How Evolving Edge Computing Will Define and Secure an AI-enabled Future
Looking through the preview of key sessions at Embedded World, taking place in Nuremberg from April 9 to 11, it is clear that conversations at the event will be dominated by edge AI and the continued expansion of AI-based applications in IoT.
There are currently billions of diverse connected devices – many with advanced AI-based computing requirements – and this number is expected to double to 30 billion by 2030. Alongside this growth comes an exponential increase in AI inference: the process of using a trained model to make predictions or decisions on new, real-world data. Together, these trends mean an unprecedented amount of personal data will be processed on billions of individual endpoint devices, each of which needs to be secured.
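To make the idea of inference concrete, here is a minimal illustrative sketch in Python of a tiny, already-trained model making a decision on-device. The model, its weights, and the sensor reading are all invented for illustration; real edge AI models are far larger and run through optimized inference runtimes.

```python
# Minimal illustration of AI inference at the edge: applying a tiny,
# already-trained model to new sensor data directly on the device.
# The weights, bias and sensor reading below are invented for illustration.

def predict(weights, bias, features):
    """Score = w . x + b; returns True if the reading looks anomalous."""
    score = sum(w * x for w, x in zip(weights, features)) + bias
    return score > 0.0

# Parameters learned offline during training (hypothetical values).
WEIGHTS = [0.8, -0.5, 0.3]
BIAS = -0.2

# A new, real-world reading from the device's sensors.
reading = [1.0, 0.4, 0.9]

# Inference happens locally -- the raw reading never leaves the device.
is_anomalous = predict(WEIGHTS, BIAS, reading)
print(is_anomalous)
```

The key point is the last line: the decision is computed where the data originates, so only the result (if anything) needs to be transmitted.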
How does prioritizing edge computing address challenges and benefit businesses and end-users?
Despite computing and security challenges, there is consensus across the technology industry to process as much data as possible locally at the edge – on the device itself or on a local server. This will deliver the following benefits to businesses and end-users:
- Lower latency, with edge AI reducing the need to push everything to the cloud. This is especially important for safety-critical applications, such as automotive and industrial IoT, where delays in compute processing could have devastating consequences.
- Greater reliability, with edge AI reducing or eliminating the dependency on constant connectivity to a centralized cloud.
- Lower costs, since moving large amounts of data back and forth between the cloud and the edge over high-bandwidth connections is more expensive than processing smaller amounts of data directly at the edge.
- More privacy, as edge AI keeps people’s personal data on the device, giving them greater control over who can access it.
- Better security, as edge devices effectively extend the cloud, making compliance with emerging cybersecurity regulations and frameworks, such as the EU-CRA, NIST guidelines and UK-PSTI, a must throughout the lifetime of edge devices.
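The cost and bandwidth argument above can be made concrete with some back-of-envelope arithmetic. The figures below (frame size, frame rate, result size) are invented assumptions purely for illustration:

```python
# Back-of-envelope comparison of data volumes: streaming raw camera
# frames to the cloud vs. sending only local inference results.
# All figures are illustrative assumptions, not measurements.

FRAME_BYTES = 200_000        # one compressed camera frame (~200 KB)
FPS = 10                     # frames captured per second
RESULT_BYTES = 100           # a compact inference result (label + score)
SECONDS_PER_DAY = 24 * 60 * 60

cloud_bytes_per_day = FRAME_BYTES * FPS * SECONDS_PER_DAY
edge_bytes_per_day = RESULT_BYTES * FPS * SECONDS_PER_DAY

print(f"cloud upload: {cloud_bytes_per_day / 1e9:.1f} GB/day")
print(f"edge upload:  {edge_bytes_per_day / 1e6:.1f} MB/day")
print(f"reduction:    {cloud_bytes_per_day // edge_bytes_per_day}x")
```

Under these assumed numbers, a single camera streaming raw frames would upload roughly 172.8 GB per day, versus about 86.4 MB per day when only inference results leave the device – a three-orders-of-magnitude reduction in data movement.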
To meet the demands of deploying AI at scale, edge computing must evolve. “Evolving edge computing” promotes hardware and software heterogeneity, frictionless development experiences and security at scale. All three support vital computing trends that will benefit the entire technology ecosystem, including ‘cloud-like’ development approaches, modular software, extensive collaboration and the removal of needless fragmentation.
Why is embracing hardware and software heterogeneity at the edge crucial?
Edge AI computing requirements vary greatly depending on the use case. Certain applications are tuned for minimal power consumption (e.g. to preserve battery life), while others demand large amounts of compute to support intensive workloads. Embracing this heterogeneity, in both the underlying hardware and the software workloads, is critically important. The foundation of hardware heterogeneity is Arm, representing the world’s most diverse, pervasive, security-backed architecture across a multitude of edge devices and use cases. This matters to developers, who want access to the broadest possible range of hardware for their applications, and it allows them to choose the best solution for each specific AI-based use case.
Embracing this heterogeneity creates the need for modular software that works as seamlessly as possible across hardware platforms from different vendors. The challenge today is that many connected devices carry layers of bespoke, vendor-specific software that must be continuously maintained and updated across decades of deployment, which drives up management costs and introduces security vulnerabilities. By embracing modular software, vendors can instead focus on differentiation and value-add, while reducing device maintenance costs by re-using standard software components, such as mainline operating systems and underlying firmware.
How does frictionless development enhance edge AI?
Maximizing software re-use is an important aspect of frictionless development for edge AI. Instead of creating new software from scratch for individual connected devices, software can be re-used across the ecosystem, which speeds up deployment and the scaling of services. This approach is a key aim of SOAFEE, an industry-led collaboration founded by Arm that spans the full automotive ecosystem and value chain, enabling a consistent, standards-based framework for software re-use at scale. While SOAFEE targets the automotive industry, the software-defined vehicle (SDV) is effectively a connected edge device with over-the-air (OTA) software continuously updating key features and applications, so its lessons can be applied across the broad spectrum of diverse connected edge devices.
Another key aspect of frictionless development is embracing new cloud-like approaches. These are designed to break traditional linear embedded software development flows that happen in isolation, target specific devices, and only start when the hardware becomes available.
Cloud-like development approaches enable software to be developed in the cloud and integrated into CI/CD workflows for continuous build, test, and validation. Developers can therefore leverage the benefits of developing their application on virtual platforms ahead of hardware being available and follow more agile approaches to the deployment and enhancement of features. This leads to a more efficient, streamlined development process and a quicker time-to-market for developers.
What role does security play in scaling edge computing?
An improved development process, however, requires a trusted and consistent approach to security through openly available standards. Arm SystemReady, PSA Certified and PARSEC are key initiatives where such standards have been set, with a collective responsibility across the ecosystem to conform and so bring the most opportunities to everyone who participates. These standards also help to minimize the layers of software on individual connected devices and tackle fragmentation.
Each standard also brings its own benefits. Arm SystemReady, introduced in 2020 to standardize how operating systems (such as Linux) securely install and boot on Arm-based devices, helps to minimize software maintenance costs across the deployed lifetime of a device. PSA Certified, co-founded by Arm, delivers an independent security evaluation and certification scheme for the IoT hardware Root of Trust (RoT), systems-on-chip (SoCs), system software and devices, giving the ecosystem confidence in the security of connected devices. It also aligns with emerging cybersecurity regulations, such as the EU-CRA and UK-PSTI, bringing assurance to device vendors and enabling connected devices to be deployed at scale. Finally, PARSEC, an open-source initiative providing a standardized approach to hardware-abstracted security services, such as cryptographic operations and secure storage, helps to remove the barriers to securing the multitude of edge AI applications being developed and deployed today, while also increasing software re-use and portability.
At Arm’s Embedded World booth, software experts from Linaro are showcasing a new project that helps the Arm ecosystem test new and updated device firmware against these standards as efficiently as possible. As more partners sign up to the project – ONELab, powered by Linaro – more seamless, agile and frictionless development experiences become possible.
Why is ecosystem collaboration essential for evolving edge computing?
Edge computing is accelerating a rapid digital transformation across the broad spectrum of technology devices. This represents an incredibly exciting business opportunity to exploit new AI-related software capabilities, but industry-wide challenges remain. Evolving edge computing addresses these challenges by promoting well-designed, collaborative industry initiatives that enable continuous advances in frictionless development, hardware and software heterogeneity, and security at scale for all participants. Achieving this, however, requires unprecedented ecosystem collaboration to scale, support, and secure the billions of connected devices of the future. The end result will be a win-win for the ecosystem and, more importantly, for the end users of tomorrow’s AI-based services, which are increasingly moving to the edge.
Learn more about evolving edge computing by downloading the latest whitepaper or attend Arm’s Embedded World Exhibitor Forum: ‘The Intelligent Edge and Scaling Compute Wherever it is Needed.’
Any re-use permitted for informational and non-commercial or personal use only.