Arm Newsroom Podcast

Arm Viewpoints: Accelerating Edge AI Innovation — with Arm’s Paul Williamson and VDC’s Chris Rommel

Listen now on:

Apple Podcasts · Spotify

Summary

In this episode of the Arm Viewpoints podcast, we explore how Edge AI is reshaping the embedded and IoT landscape, marking a new era of smarter, more capable systems at the network edge.

Host Brian Fuller, Editor in Chief at Arm, introduces a conversation between:

• Chris Rommel, EVP, IoT and Industrial Technology at VDC Research

• Paul Williamson, SVP and General Manager, IoT Line of Business at Arm

Together, they discuss:

• Why Edge AI represents a major inflection point for embedded and IoT developers

• How Linux, Python, and open-source tools are transforming the software stack

• The growing role of Arm’s heterogeneous compute—from MCUs to NPUs—in accelerating AI workloads

• What VDC’s latest research reveals about the challenges engineering teams face in deploying edge AI

• How Arm’s developer tools, SDKs, and ML frameworks are helping teams move faster and more efficiently

• What engineering leaders can do today to future-proof their software and system architectures for AI innovation

Plus, why success at the edge depends not just on powerful hardware—but on flexible platforms, trusted tools, and a vibrant ecosystem that connects silicon to software.

Whether you’re an engineer optimizing AI at the edge, a systems architect planning your next-generation platform, or simply tracking how IoT is evolving, this conversation offers a front-row view of the future.

Speakers

Chris Rommel, Executive Vice President at VDC Research

Chris leads the firm’s IoT/Engineering & Industrial Technology practices along with Sales and Marketing operations. With over two decades of experience as a technology consultant and industry analyst, Chris provides data-driven insights to technology companies and the investment community. A recognized thought leader in IoT, Edge computing, Security, AI/ML, and Software Development, he serves as a trusted advisor to leading firms navigating automotive, defense, medical/life science, and industrial markets. Chris is a frequent speaker at industry events and holds a Bachelor’s degree in Business Economics from Brown University. Outside of work, he volunteers as a coach with Hopkinton Little League and Basketball Association.

Paul Williamson, Senior Vice President & General Manager, IoT Line of Business

Paul leads the IoT line of business at Arm. He previously ran the client, security, and wireless businesses at Arm and has been involved in connected devices since the early days of Bluetooth, developing innovative products for consumer, medical, and industrial markets.

Prior to joining Arm, Paul led the low-power wireless division of CSR, a fabless semiconductor business (now part of Qualcomm). Paul started his career in engineering consultancy, working with leading global brands to develop innovative products and services. He holds an MEng from Durham University.

Brian Fuller, host

Brian Fuller is an experienced writer, journalist and communications/content marketing strategist specializing in both traditional publishing and evolving content-marketing technologies. He has held various leadership roles, currently as Editor-in-Chief at Arm and formerly at Cadence Design Systems, Inc. Prior to his content-marketing work inside corporations, he was a wire-service reporter and business editor before joining EE Times where he spent nearly 20 years in various roles, including editor-in-chief and publisher.  He holds a B.A. in English from UCLA.

Transcript

Highlights:

[00:02:00] Edge AI as a tipping point. Key points from VDC’s recent research.

[00:08:00] Developer adaptation and new skill sets.

[00:06:00 / 00:13:00] Arm’s ecosystem advantage.

[00:15:00] The shift to heterogeneous compute.

[00:20:00–00:24:00] Software and tooling as critical enablers.

[00:27:00–00:30:00] Future-proofing for AI innovation.

Brian: [00:00:00] Hello, and welcome to a special episode of the Arm Viewpoints podcast series. I'm Brian Fuller, editor in chief at Arm. In this segment we're gonna hear a special conversation between Paul Williamson, Arm's SVP and General Manager for IoT, and Chris Rommel, EVP of IoT and Industrial Technology at VDC Research.

In the conversation, they'll discuss why edge AI is a major inflection point for embedded systems, how developers are adapting to the rise of Linux and Python in IoT development and what that means for software tooling, and how Arm's broad portfolio and partner ecosystem are helping OEMs deliver smarter, more scalable edge solutions.

And what engineering leaders need to do to future-proof their software and system architectures for AI innovation. Let’s dive right in.

Chris: Hello and welcome to our webcast, Accelerating Edge AI Innovation. My name is [00:01:00] Chris Rommel. I'm the EVP of IoT and Industrial Technology here at VDC Research. And today I'll be joined by Paul Williamson, who's the SVP and General Manager of the IoT business unit at Arm.

So before I go any further, I just wanna provide a little bit of background for those who aren't familiar, to tell you who VDC is and what we do. We're the leading market research and consulting company focused on systems engineering and the embedded technology market, and we've been tracking the requirements and preferences of that global community of engineers for over 30 years.

Today I'll be sharing the results of some research we conducted, sponsored by Arm, in which we captured the insights of over 275 engineering professionals. From that research, we can offer some insights into the leading business and technical trends that are impacting development organizations today, as well as some of the best practices to address those trends.

Over my time at VDC, I've seen the market hit various points of inflection, times when key trends converge and really challenge the [00:02:00] status quo. And now is no different; this is really one of those times. That new trend, that new change, is edge AI. It has become a new frontier for differentiation and is really pushing development organizations to adapt and evolve more rapidly than they did during the rise of IoT just a few years ago.

Systems are becoming more intelligent, running more sophisticated workloads. But all that said, alongside what is really a universal recognition of the need for edge AI and the capabilities around it is the fact that the transformation to edge AI is uneven. There is a range of maturity across engineering organizations.

And it's really pushing a number of engineering organizations into uncharted territory. They often lack the resources or expertise to efficiently bring new solutions to market. [00:03:00] And on top of that, many of those organizations still have the same traditional compute and design requirements, be it real-time operation, security, or safety-critical function; those are all still there.

But the pursuit of edge AI, which you can see even in the huge jump over three years on the chart on this slide, also brings with it new challenges. It requires a reevaluation of your existing tools, your components, and the resources and human capital you have at your organization. In fact, that survey I mentioned showed that, next to cost, lack of experience is cited as the top issue impacting software and system development.

To give a brief example of some of that intersection of change and skill-set gaps: as part of this research, I spoke to an engineer in the agriculture industry, and it was a really interesting anecdote. Interestingly enough, they designed a machine to dry cacao, the key ingredient in chocolate, and the machine they're building uses a combination of hardware, software, and machine [00:04:00] learning to monitor and control the drying process of the beans, looking at everything from temperature and humidity to rotation.

To me, it was really interesting from a couple of angles. First of all, from a hardware perspective, they're using a 32-bit Arm-based MCU for some of the sensory inputs and motion control, and then a more sophisticated Arm-based SoC separately running Linux for the higher-performance AI/ML application.

For software development, the company then used both C, which they had a lot of experience in, for programming the MCU, and Python for the SoC. And here's where it's pretty interesting; I think it's a really cool commentary on the state of the market. As part of this development, it was really a process of learning for this organization.

'Cause they said they didn't have a lot of background in this arena, aside from recognizing a need. And they recognized, and admitted, that they had not yet fully optimized the machine learning and the integrations with libraries like [00:05:00] TensorFlow. But for the next project, they were already actively looking for ways to improve.

And so to me, I think that's ideal: you are striving to innovate, pushing your organization and its comfort level, but you still recognize that you and your team need to do better. You need more emerging skill sets and new solutions to help you get your ideal product to market and continue to innovate.
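The division of labor Chris describes, deterministic sensing and control in C on the MCU and higher-level inference logic in Python on the SoC, can be sketched roughly as follows. This is an illustrative reconstruction only: the sensor fields, thresholds, and function names are invented, and the simple threshold check stands in for the TensorFlow Lite model the team would actually run.

```python
# Hypothetical sketch of the SoC-side monitoring loop for a cacao dryer.
# Names and thresholds are illustrative, not from the episode.

from dataclasses import dataclass

@dataclass
class SensorReading:
    temperature_c: float   # drum air temperature, reported by the MCU
    humidity_pct: float    # relative humidity inside the dryer
    rotation_rpm: float    # drum rotation speed

def classify_drying_state(reading: SensorReading) -> str:
    """Stand-in for ML inference: map sensor inputs to a drying state.

    In the deployment Chris describes, a model on the Arm-based SoC
    would produce this classification; the 32-bit MCU handles the raw
    sensing and motor control in C.
    """
    if reading.humidity_pct > 60.0:
        return "too_wet"       # keep drying, perhaps raise temperature
    if reading.temperature_c > 55.0:
        return "overheating"   # back off the heater, speed up rotation
    return "on_target"

def command_for_state(state: str) -> dict:
    """Translate the classification into a command for the MCU."""
    return {
        "too_wet": {"heater": "up", "rotation": "hold"},
        "overheating": {"heater": "down", "rotation": "up"},
        "on_target": {"heater": "hold", "rotation": "hold"},
    }[state]

reading = SensorReading(temperature_c=48.0, humidity_pct=72.0, rotation_rpm=12.0)
state = classify_drying_state(reading)
print(state, command_for_state(state))
```

In a real build, the classification step would wrap a model interpreter call, and the resulting command might travel to the MCU firmware over a serial link; the split keeps hard real-time control in C while the Python side stays free to evolve.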

Paul, how do that anecdote and this data resonate with what you're hearing and seeing in the ecosystem, and how are you enabling it?

Paul: Yeah, thanks Chris. It's a great anecdote, and I think some of these use cases are really informative. We at Arm have to look five-plus years into the future to understand what the needs of developers are going to be, to land the key workloads they're gonna have in that timeframe, and how we serve them best.

And we've been seeing this growth of using AI and AI tooling to solve compute problems at the edge for some time. [00:06:00] We've had to react to that, and we've tried very hard to ensure that we're doing so in a way that layers the technology into the existing compute platforms in a way that is familiar and accessible to developers.

Whether that be the improvement in AI instruction operations, even on the CPUs in our A-class products, or, as you pointed out, bringing A-class and Linux-based development down into smaller form factors, such as the launch of our Armv9 Cortex-A320 to allow you to run Linux on smaller, power-efficient systems, through to things like embedded neural processors.

The world is becoming somewhat heterogeneous, and as you've mentioned, that presents a sort of developer challenge: how do I most efficiently run my AI algorithm? So building things like our Ethos-U NPUs, which are deployed in these embedded microprocessor and microcontroller systems, and making sure that the software tools allow the developer, in their normal flow, to deploy an AI algorithm efficiently onto one of those [00:07:00] end devices.

Those have been key focuses for us, and I think they align with what you've heard both anecdotally and in this survey, with the growing importance of being able to deploy AI as a key workload.

Chris: No, that makes a lot of sense and it’s good to know you’re seeing those same things too and are making the investments to help your customers adapt to that as well.

So for years, looking at the IoT and embedded market, it was almost defined by its fragmentation. That was really a unifying theme: you had diverse systems, form factors, configurations, and different operating requirements that created an ecosystem of highly specialized solutions. That certainly is still the case in many ways.

And nowhere was that dynamic more apparent than within the OS landscape. We had dozens of different operating systems designed to meet very specific sets of requirements around resource constraints or performance. But now what we've seen, and Paul just touched on this a bit, is [00:08:00] that these new requirements for advanced functionality and edge AI are driving the adoption of rich operating systems, and open-source operating systems like Linux have gained increasing adoption in the embedded marketplace. So you have that combination of increasing contributions to the Linux kernel in the real-time domain, a maturing ecosystem of solutions, and experience at many engineering organizations that really have helped push Linux adoption to new heights.

In fact, over 50% of the IoT engineers we surveyed said they're planning to use an open-source operating system in three years, and that's as a primary operating system. Again, there's still very much a need for other options; multiprocessor systems like the one I described before are a great example of where you can see that. But it also shows the shift and some of the important new considerations and best practices for engineering organizations making this shift, or even looking down the road, recognizing a need for [00:09:00] edge AI, and thinking about what that means for the runtime.

Ultimately, what stood out in the data when we asked those folks who were moving to adopt a rich operating system, when they hadn't had one in the past, was that tooling support is paramount. They have to ensure both optimization for their new target hardware and efficient software development for these next-generation workloads, and OEMs can look to a range of sources when they're thinking about tools.

In addition to standalone tool providers and those provided by some of the OS suppliers, semiconductor ecosystem participants like Arm provide a range of SDKs for software and machine learning model development. Those can be a great head start as well.

On that topic of software, some of the challenges touch on that point of recognition and self-assessment the agriculture engineer was discussing with me. Code [00:10:00] growth continues to be one of the most challenging and transformational issues facing those organizations. Code growth requirements go beyond just needing to scale internal resources. Clearly, that's always a big crunch when you just need to develop more, faster, but today you also have to think about what you're using to develop that software.

And how that aligns with your organization's skill sets and some of the solutions you're using. For decades, embedded software development was mostly conducted in C and other lower-level languages. Now, however, embedded engineering, as we discussed, is no longer synonymous with just small-footprint, fixed-function, or single-purpose devices.

Today, edge AI systems in particular need to perform multiple sophisticated points of functionality. They have multiple processors or cores, and the requirements of edge AI and the ecosystem around it that people want to tap into have really recast the languages of choice needed for this next step in innovation.

For example, around [00:11:00] AI, we've seen a rapid increase in the use of Python, as well as other languages targeted at heterogeneous compute or HPC, like Julia, MATLAB, or R, and it's changing many organizations' needs on the back end. Plus, the added thing now is, as I noted, these are no longer fixed-function devices.

You have an environment in which it's become an expectation that software is going to continue to evolve post-deployment, and that brings with it even greater challenges and a need for change in how one thinks about software development, and the organization that supports it, or their solution development.

Ultimately, for many organizations, the status quo no longer suffices. In order to build and deploy across a range of devices, engineering organizations must identify new platforms that offer opportunities for integration with things like standard AI frameworks, while also offering compatibility with legacy code bases and IP, and additional opportunities for integration and [00:12:00] synergy with hardware.

Ultimately, these organizations are trying to do more, and they need new platforms to do it. So they need to look for partners that can offer solutions to help them bring more integrated solutions to market faster.
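The compatibility with legacy code bases that Chris mentions often comes down to a foreign-function bridge: new Python-based AI code calling into existing, proven C routines. A minimal sketch using Python's built-in ctypes module is below; libc's abs() stands in for a legacy routine, and "libcontrol.so" is a made-up name for a library a team might build from its own C sources.

```python
# A minimal sketch of bridging new Python code to an existing C code base
# via ctypes, Python's built-in foreign-function interface. A real project
# would load its own shared library compiled from the legacy C sources,
# e.g. ctypes.CDLL("./libcontrol.so") ("libcontrol" is hypothetical).
import ctypes

# CDLL(None) exposes symbols already linked into the process, which on
# Linux includes the C standard library; we borrow libc's abs() as a
# stand-in for a legacy control routine.
libc = ctypes.CDLL(None)

# Declare the C signature explicitly so ctypes marshals arguments correctly.
libc.abs.argtypes = [ctypes.c_int]
libc.abs.restype = ctypes.c_int

print(libc.abs(-42))  # -> 42
```

The design appeal is that the battle-tested C stays untouched while the Python layer, where the AI framework integrations live, iterates quickly on top of it.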

Paul, when you think about the change in software, be it rich OSes or software development, how has that changed how you think about the market and what different organizations need in order to be successful?

Paul: Yeah, it's a good question, and I do think this is some of the most interesting insight; it really drives our thinking. We've got almost as many people working on software, frankly, as we have on hardware and compute systems.

And this trend has been happening for some time. In the era of connectivity becoming a bigger thing, we saw more complex software stacks emerging, but particularly as we moved to the edge AI challenge, the challenge of handling more data in these more complex systems has shifted some of the programming focus to these higher-level languages now.

The [00:13:00] underlying need for those real-time control systems that you mentioned isn't going away, and we do still see that consistent need for deeply embedded systems for power management and real-time functionality. But to handle this data and generate these insights locally, we are seeing people push for Python and higher-level languages, and the use of richer operating systems, to be able to handle the scale and volume of data involved.

And that means we've had to respond with our hardware roadmap, but also with the software and tooling support we put into the ecosystem to enable this. A lot of our focus on our heterogeneous accelerators has been around bringing in PyTorch and ExecuTorch support, for example, or our KleidiAI libraries that allow people on embedded Linux to deploy AI efficiently on the underlying hardware.

So a big focus for us has been meeting the need this trend is signaling in the growth of the use of languages like [00:14:00] Python. We see that as being driven by this need for AI data, so there's strong alignment with what we've seen in discussions with our partners.
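"Deploying AI efficiently on the underlying hardware," as Paul puts it, typically involves quantization: shrinking float32 weights to 8-bit integers that edge CPUs and NPUs execute far more cheaply. The sketch below shows the arithmetic of a simple symmetric scheme in plain Python; it is illustrative only, and a production flow would use the framework's own converter rather than hand-rolled code.

```python
# Illustrative sketch of symmetric 8-bit quantization, the kind of
# transformation edge AI toolchains apply so model weights run
# efficiently on constrained hardware. Pure Python for clarity.

def quantize_int8(weights):
    """Map float weights onto int8 range [-127, 127] with a single scale."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the int8 representation."""
    return [v * scale for v in q]

weights = [0.82, -1.27, 0.05, 0.4]
q, scale = quantize_int8(weights)
approx = dequantize(q, scale)

# Each recovered weight is within half a quantization step of the original.
assert all(abs(a - w) <= scale / 2 + 1e-9 for a, w in zip(approx, weights))
print(q)
```

The trade-off is a small, bounded rounding error per weight in exchange for a 4x smaller model and integer arithmetic, which is exactly what embedded NPUs are built to execute.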

Chris: Great. Let's take a step back from software for just a minute, as important as that is.

I think it's important to recognize that the AI processing landscape is incredibly diverse and complex, and that change is underway in the ecosystem. Highlighting some other really interesting research from the survey we conducted, you can see the standing of some of the different architectures as engineering organizations think about the next generation of their designs. And probably no surprise to many of you: given all this change, Arm has continued to make progress and innovate, and it's come to represent the leading choice among engineering organizations. Not only is it already cited by over 50% of the engineers we surveyed as the primary architecture for their current project, but when we asked what they expect to use on a similar project in three years, over two [00:15:00] thirds of them said they expect to use Arm.

Ultimately, when I think about those results, what that means to me is that no one can really afford to lock themselves into closed platforms that risk limiting corporate agility or innovation. With a platform and portfolio that really scales from the smallest-footprint MCUs to high-performance CPUs, GPUs, and NPUs, Arm presents the ecosystem a range of solutions, both to the semiconductor suppliers and to the OEMs. And that portfolio breadth is very important because of the increasingly complex systems we mentioned that SoCs need to support in those next-generation designs.

More organizations are looking to incorporate different types of accelerators for those advanced workloads. As another example, it's not shown here on this slide, but the use of NPUs like Arm Ethos is expected to nearly double in the next three years. So with all of these different options, and there are other things out there obviously, it [00:16:00] shows me that it's critical, and many organizations are recognizing the need, to look for new best-in-class solutions around semiconductor technology for the system performance and efficiency they require, but then to tie that back to what can help their organization meet its goals with the best possible operational efficiency from a development perspective.

And that ties back to another conversation I had with an OEM as part of this project. The person I was talking to was in the medical field, and they were developing a new eye-tracking system that provides assistive typing technology to people with disabilities.

It had been an old platform, technology that hadn't been updated much recently, and they wanted to leverage some of the newest technology available and create a new long-term platform for their future product development, all with the goal of having their solution reduce its cost, its size, and also the noise it made compared to previous versions, for [00:17:00] the sake of usability within their community.

In order to address those more complex requirements in the new eye-tracking system they're designing, they decided to use an Arm-based processor to replace what was a series of 8-bit microcontrollers in the old design. This new design allowed for not only more processing power and integrated features, but it also helped them reduce the complexity of their system and the bill of materials in terms of hardware costs.

So I think that was a really interesting example: there was a recognized need for change, the engineering organization saw there were new platforms available that could meet a number of needs, and the end result was something not only simpler but that could save hardware costs for the organization as well.

All of it provides a foundation not only for their current product, but for future generations to come. Paul, [00:18:00] as you think about how the ecosystem has been changing and some of the architecture shifts the engineers we surveyed were seeing, how does that resonate with you in terms of the shifts you've seen in the ecosystem?

Paul: Yeah, thanks Chris. I love the wording used there of platform; we try to think about this as a platform for innovation. And in those embedded use cases, clearly it's really reassuring and positive to see that growth in the use of embedded accelerators like Ethos in these designs.

And it speaks to the ability to solve really interesting, challenging problems with that sort of flexible platform. By embedding that capability, it provides you scalability for future capability, but also does so with real power efficiency and cost efficiency for your future platform.

I think almost more striking for me is the drive to these higher-performance systems as well, and the stat there you have for the growth in future designs of Arm into these [00:19:00] embedded systems. I think it speaks to what we're seeing: we have this community of over 20 million software developers, and they're moving workloads and software from cloud to edge.

And so some of these applications are having to manage both, having a common software footprint across cloud and edge to deploy in these Arm-based edge nodes, while having access to silicon that targets their specific needs. That flexibility at the edge, coupled with that common developer channel to the increasing use of Arm in the cloud, really reinforces the value of the Arm platform to the ecosystem.

And seeing that reflected in people's perception of how they will use the technology is really great to see. So I'm pleased that we're gonna see more performance systems leaning towards Arm in the future because of those factors. So yeah, it aligns closely with what we've seen.

Great.

Chris: With increasing software complexity, scarce development resources, and ultimately more demanding [00:20:00] user requirements, I think we've already established that engineering organizations are being challenged to adapt. You have these exploding code bases and new functionality requirements, and everyone needs to find ways to simplify and accelerate software development.

To do that, it's very important to recognize what the key choke points are for edge AI development, resources, and productivity. Reflecting back on some of this research, when we asked developers and engineers what the most important characteristics were on their edge AI projects, software development and the ability to abstract code were cited as more important than power and performance. So as important as processing elements are, they themselves are just part of that equation. Again, it's important to think about what you can do to identify ways to reduce complexity and accelerate the development process. To me, that really highlights the importance of identifying whatever solutions and [00:21:00] ecosystem centers of gravity there may be that can help organizations be successful in that regard. And to that end, to address next-generation edge AI requirements, engineering organizations really need to look for tools that can speed their development and offer that level of optimization for whatever next-generation hardware they may be using.
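The "ability to abstract code" that respondents ranked so highly can be made concrete with a small backend-selection pattern. This is a hypothetical sketch, not any particular vendor's API: the backend classes and the trivial doubling "model" are invented purely to show the shape of the abstraction, where application code targets one interface and only the backend choice changes with the hardware.

```python
# Hypothetical sketch of hardware abstraction for edge AI inference.
# Application code calls run() on one interface; backends map that call
# to whatever compute is present (CPU fallback, NPU fast path).

from abc import ABC, abstractmethod

class InferenceBackend(ABC):
    @abstractmethod
    def run(self, inputs: list) -> list: ...

class CpuBackend(InferenceBackend):
    def run(self, inputs):
        # Portable fallback: plain arithmetic standing in for inference.
        return [x * 2.0 for x in inputs]

class NpuBackend(InferenceBackend):
    def run(self, inputs):
        # Would dispatch to an NPU driver; same contract, faster path.
        return [x * 2.0 for x in inputs]

def select_backend(npu_available: bool) -> InferenceBackend:
    """Application code never changes; only the backend choice does."""
    return NpuBackend() if npu_available else CpuBackend()

backend = select_backend(npu_available=False)
print(backend.run([1.0, 2.5]))  # -> [2.0, 5.0]
```

The payoff is the future-proofing Chris keeps returning to: when a next-generation SoC adds an accelerator, only the backend and selection logic change, not the application above it.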

And many organizations, as we noted, can and should look to semiconductor and semiconductor IP organizations like Arm and its licensees as a first step for development solutions. As part of this research, we asked organizations to rank and rate some of the different sources of solutions that are out there.

And in the survey, Arm was recognized as the clear [00:22:00] leader in providing edge AI software development solutions. While many different vendors have other helpful tools and solutions out there, Arm was consistently rated higher than its peers in this research; in fact, for some of these, you can see it's cited at rates 50% greater than its closest competition. Again, I think that's really interesting from a couple of perspectives. One, I think it reflects Arm's long history of providing leading development tool technology, everything from Keil, which it invested in when it launched its M-class processors, to when it launched virtual hardware platforms, to its recent Armv9 edge AI platform launch.

The combination of Arm's history, its unique ecosystem position, and its consistent investment and recognition of the need to invest has put it in a position where it's a widely embraced IP provider that can also offer these software design capabilities to the ecosystem, to ensure that everyone can be as successful as possible in bringing these more sophisticated edge AI applications to market, not only now but in the future.

Paul, as you think about the results of this [00:23:00] research, as well as the investments your team at Arm is making in this ecosystem, how do you reflect on that, as well as on plans to help the ecosystem in the future?

Paul: Yeah, it's really heartening to see the numbers here, and as you say, it's something we've done for a long time.

I think Keil is now something we've been investing in for 20 years, and we are continually evolving that as well as expanding the support we give to the ecosystem. One of the things I also recognize in looking at this is that a lot of the work we do at the foundations is supported by our partners, who build solutions on top of it that expand the applicability of Arm's offering into the broadest possible range of software solutions, giving our partners the ability to build really compelling products.

But we do continue to focus on it, whether it be taking our partnership with Keil, to working with GitHub to ensure people can use modern developer flows, to areas of investment [00:24:00] like LiteRT and PyTorch, and making sure those land efficiently in your developer flow, whether you are using the CPU or an Ethos NPU.

We have to think about how we react to the changing needs in the market, and how we continue to support developers for the long term with the base platform, with the work we do here at Arm, and with the partnerships we build. As I've touched on, there's a lot that is changing for exciting reasons around edge AI, but there's also an underlying need for long-term trust and stability in the tools. The Keil investment has been one that has taught me a lot about the long-term trust that is placed in us with these tools, allowing professionals to build a system and to be able to come back and modify and improve that system over time.

Providing that consistency across silicon platforms and performance points is really critical to what we do. It's heartening to see it recognized, and it's something we continue to look at: where do we invest next to [00:25:00] meet this edge AI wave and this desire for higher-performance, Linux and open-source-based platforms, which changes the metric a little bit from the constrained, embedded RTOS world.

So you'll hear more from us. We're absolutely looking at this space, Chris, and pushing forward to make sure we support these new capabilities.

Chris: That's great to hear. Obviously, our research confirms what I think a lot of people know, believe, and expect from Arm in terms of its leading position in helping engineering organizations address their next-generation requirements, whether edge AI or otherwise.

But I do think it's worth reflecting that edge AI today is rendering insufficient the incumbent engineering technologies and design processes many organizations are using. Now we're in a situation where, in many cases, technology that was first deployed and developed for client and enterprise systems, whether GPUs or programming [00:26:00] languages like Python, is providing a new catalyst for the industry and a mechanism for rapid evolution.

Already, the growing role that software plays within electronic system functionality and differentiation has upturned an ecosystem traditionally averse to change. You look at many of the safety-critical industries; those have traditionally been very slow-moving. And while many of the requirements we discussed, whether real time or otherwise, are unique and to a degree inelastic, others, such as connectivity, enterprise integration, security, and AI, are becoming much more important. The fabric for continued edge innovation is really woven together by the different choices made on the software side and in the different hardware components below. As it pertains to software, OEMs face mounting pressure to design and deliver software as a seamless part of the overall system architecture.

And again, [00:27:00] they need a foundation for the entire product lifecycle. So as an engineering organization, you need to think about what new solutions you need, how those address your evolving requirements, and how they can hopefully help you speed your development and time to revenue. Obviously, there are a couple of different examples we've talked about a few times: open-source and other rich operating systems really jump off the page in terms of the correlation with adoption around edge AI, with over 50% of engineers already planning to use these open-source operating systems within the next three years. And beyond just tapping into rapidly maturing ecosystems like the one around Linux, it's really critical that organizations take steps to future-proof their designs.

It takes steps to future-proof their designs. Architect their systems with flexibility and evolution in mind and that really. Makes them need to think about the pace of climate change, not only now, but in the future, and how they need to think about building devices that are capable, that [00:28:00] evolution, both in terms of the functionality of ships as well as what can happen post-deployment.

And with all those changes and choices that organizations have, and with AI and other advances really pushing the skill sets and needs of organizations and the potential they have for their devices, these organizations need to look for solutions that are highly optimized for those new, demanding AI workloads. And when they recognize the value in those systems, it comes with a need to select a partner capable of providing the needed combination of products, scale, and support.

Ultimately, through our research, Arm proved itself an example of a company that has built a foundation of products and expertise that can serve the industry now and in the future, and it's been doing so for decades, as we've discussed. And with the complexity of IoT systems on the rise for edge AI and other workloads, it's really important as an organization to think about a partner that can [00:29:00] be someone you continue to work with, not only now but in the future. We've seen that reflected in Arm's continued advancement and investment in its IP portfolio, as well as its software development resources, positioning its customers and OEMs alike as having a trusted partner that can help them continue to innovate going forward.

Paul, as we reflect back on the discussion today and some of the research we conducted, what really stands out to you, or what final thoughts do you think our audience should take with them?

Paul: Thanks, Chris.

Some of the trends that we see in this industry and the challenges faced by developers are ones I was hearing anecdotally, but it's great to see them in independent data, so thank you for developing that for us. I know we're gonna share it more widely, but I think you've picked out some of those key [00:30:00] trends we're seeing, and at Arm we're really committed to understanding how we support this need for higher performance, these open-source operating systems, and developers across the industry in their challenge to bring the benefits of AI to life. We can, I guess, move forward knowing that we're doing something that looks to be needed, further reinforcing the value we can bring with the Arm platform. Thank you for bringing us the insights.

Chris: Great. Thank you very much, Paul for your time. This has been really interesting to look at the evolution of the Edge AI ecosystem together, and we’re certainly looking forward to more discussions in the future.

 
