The Physical AI revolution: How Universal Robots is bringing intelligent automation from lab to factory floor
Summary
In this episode of the Arm Viewpoints podcast, we explore how AI and advanced computing are revolutionizing robotics, transforming factories, warehouses, and workplaces into intelligent, adaptive environments where humans and machines collaborate seamlessly.
Host Brian Fuller, Editor in Chief at Arm, sits down with Anders Beck, who leads the development of Cobot Technologies at Universal Robots, a unit of Teradyne Robotics.
Together, they discuss:
- Why AI is the breakthrough that’s finally solving robotics’ flexibility challenge—from pallet detection to autonomous pick-and-place operations
- How Universal Robots and MiR are deploying collaborative robots (cobots) and autonomous mobile robots (AMRs) to tackle human-centric tasks that were previously impossible to automate
- Real-world deployments already in production, like Ocado’s AI-powered fulfillment centers and MiR’s intelligent pallet jack systems
- Why the workforce of the future needs to embrace robotics as a core skillset, not a threat, especially as demographic shifts create severe labor shortages
- How traditional high-volume industrial automation has plateaued, and why high-mix, low-volume manufacturing demands a completely different approach
- The critical role Arm technology plays in Universal Robots’ platform—from distributed computing across every joint to seamless integration of NVIDIA Jetson Orin for edge AI processing
- What “physical AI” means for the next wave of robotics innovation and why vision-language-action models will enable robots to learn and adapt through simple prompts
Plus, why the biggest barrier to robotics adoption isn’t cost or capability—it’s the shortage of robotics experts, and how democratizing access through low-code AI platforms and robust developer ecosystems is the key to unlocking widespread automation.
Whether you’re an engineer working on embedded systems, a manufacturer exploring automation strategies, or simply fascinated by the convergence of AI and robotics, this conversation offers an insider’s perspective on an industry experiencing its most exciting transformation in decades.
Speakers
Anders Beck, Vice President, Technology, Universal Robots
Anders serves as Vice President of Technology at Universal Robots, where he leads technology strategy and innovation for one of the world’s leading collaborative robotics companies, a unit of Teradyne Robotics.
With nearly 20 years in robotics, Anders holds a Ph.D. in Robotics and an M.Sc. in Electronics Engineering from DTU – Technical University of Denmark. His career began at the Danish Technological Institute, where he spent over eight years advancing flexible industrial robotics and leading large-scale European research initiatives before joining Universal Robots in 2017.
At Universal Robots, Anders drives the development of next-generation cobot technologies and physical AI platforms. He has been instrumental in integrating advanced AI capabilities into Universal Robots’ platform, partnering with companies like NVIDIA to enable real-time 3D vision and adaptive task execution. His work focuses on democratizing automation—making robotics accessible to manufacturers of all sizes through software-defined, low-code solutions.
Anders brings a unique perspective shaped by his background as a former Danish national team swimmer and elite athlete (1998-2005), where he competed at European Championships and World Championships.
He is a recognized voice on the future of industrial robotics, frequently speaking at industry events about physical AI, human-robot collaboration, and the transformation of manufacturing through intelligent automation.
Brian Fuller, host
Brian Fuller is an experienced writer, journalist and communications/content marketing strategist specializing in both traditional publishing and evolving content-marketing technologies. He has held various leadership roles, currently as Editor-in-Chief at Arm and formerly at Cadence Design Systems, Inc. Prior to his content-marketing work inside corporations, he was a wire-service reporter and business editor before joining EE Times where he spent nearly 20 years in various roles, including editor-in-chief and publisher. He holds a B.A. in English from UCLA.
Transcript
Brian: [00:00:00] Hello, and welcome to another episode of the Arm Viewpoints Podcast. I’m Brian Fuller, editor in chief at Arm, and today we’re gonna dive into one of the fastest-moving application spaces on the planet: robotics. Joining us is Anders Beck, who leads the development of Cobot Technologies at Universal Robots, a unit of Teradyne Robotics.
In this episode, we discuss how AI is revolutionizing robotics, making robots more flexible and efficient; how low-code AI platforms are accelerating robotics innovation; why and how the workforce will need to adapt to working alongside robots; how traditional industrial robotics is plateauing, requiring new approaches; and why he thinks Arm technology is crucial for the next generation of robotics.
Let’s get to it. So Anders, welcome. Thanks for taking the time.
Anders: Thank you so much, and thanks for having me here.
Brian: Before we dive into the good stuff, share a little bit about your journey in robotics. I get the sense that you came out of the womb playing with robots. How has that shaped your journey from early on to now?
Anders: You’re right. Pretty much all through my professional career and even through my education, I was very excited about robots. I have a degree in electrical engineering, but even there, it was all about robotics control, high-level control, low-level control.
And I took a PhD in robotics and pretty much immediately jumped straight into what was, at that point, a new and emerging field, which was all about flexible robots: how do we solve the tasks that are yet to be automated in the world today? Because the main barrier, the main remaining gap of work and tasks that are not yet automated, is all the things where there’s a lot of flexibility, a lot of variation, and so on. Everything that has really high volume and is very repetitive, we have good, bespoke, very fast and efficient solutions for. But everything where product life cycles are short, that needs the finesse only people have, those are still the things we do manually in the world today.
I’ve been working on that problem for almost 20 years, the last eight of them within Universal Robots. I was asked to come in and drive new product innovation and technology innovation, and I’ve also been responsible for corporate strategy: how do we penetrate and open new markets? Especially, how do we do technology-driven new business innovation? How do we use new products to get closer to these problems? Because that’s also what Universal Robots is all about, right? It’s all about making it a lot easier to deploy automation, to get to all the jobs out there that are still largely manual today.
How do we effectively use robots to augment this work, to accelerate it? How do we get more productivity with the workforce we have today? Within manufacturing, within logistics. And even today we see a lot of growth outside of these traditional industrial segments and into non-industrial applications as well.
Brian: Yeah, and in 20 years you’ve seen tons of change already. Let’s flash forward to the present. We have Universal Robots, which you’ve mentioned, and MiR, and they’re central to Teradyne’s robotics strategy. How do they complement each other, and what unique roles do they play in advancing automation?
Anders: What’s unique about both MiR and Universal Robots is that they are what we call advanced robotics. What’s unique about advanced robotics is, for one thing, that they’re designed to be faster to deploy, easier to use, and designed to solve a lot of these especially human-centric problems that we have yet to automate. And one of the big reasons is that being software-defined is really the key driver for solving all this. It makes them fast to reprogram, it makes them adaptable, and it allows the use of a lot of sensory systems to solve the tasks.
How they complement each other is that the MiR robots are mobile. They are what we call AMRs, which really means they use natural features: they navigate around in factories, in logistics, in all sorts of settings just by looking at their surroundings and adapting and adjusting trajectories, courses, and even missions dynamically as they go. Similarly, Universal Robots makes collaborative robots. They are designed to be lightweight, designed to have features that allow them to work unfenced, in close proximity with humans. They can work really fast under safeguarding, they can solve a lot of different tasks, and they’re easy to integrate, which really makes it a much, much more approachable way to bring automation into modern production.
And that’s really the common denominator for this whole technology portfolio when we think about advanced robotics: it’s approachable, it fits within your existing workspaces, and it solves problems that people do today.
Brian: So you spend your life [00:05:00] watching science fiction movies about things that will happen sometime in the future, and then you wake up and they’re here.
These real-world deployments are happening right now. Can you talk about some examples, like the MiR Pallet Jack or the Universal Robots AI platform? ’Cause that’s tangible now.
Anders: You’re completely right. This category of advanced robotics has been in the industry for a while; Universal Robots actually had its 20-year anniversary yesterday.
What has really happened, especially over the last three or four years, is that AI has become such a powerful toolbox to unlock and solve these flexibility challenges that we have in manufacturing. Until now, we could rely on bespoke programming. We might be able to make it flexible, but it still required expert programming. Now AI takes care of all of that, and I think we can draw a lot of parallels to how we see AI starting to automate workflows in the offices. Now we’re starting to see some of those things come to robotics as well.
A really good example is the MiR Pallet Jack. The MiR Pallet Jack is the world’s first fully AI-powered pallet jack system. It’s a mobile robot with its own forklift, so it can drive around in warehouses, and it leverages AI to detect and manage the whole pallet operation.
What’s unique about that is, if you think about pallets in real-world warehouses, they look very different. You can have them shrink-wrapped with black shrink wrap or transparent shrink wrap. They could be painted, they could be full of stickers, they could be halfway broken. There are so many ways a pallet can look, and if you used traditional, old-school machine vision to detect the pallets, it would be very fragile to all these conditions.
With AI, we built a very robust scheme using generative AI that can detect pallets almost no matter how they look. And even during the process, when it’s docking, when it’s approaching the pallets, it continuously monitors whether everything goes exactly as expected and reacts dynamically in real time if anything changes. So it’s quite unique to use AI to make a pallet detection and pallet management system capable of reacting in real time to all of this. It’s a product we launched at the end of last year, and the excitement in the industry has been tremendous. There’s really high demand, and this amount of flexibility is really what the feedback we get from our customers is all about.
And similarly on the Universal Robots side, we launched our AI platform at the end of last year as well, and that’s also gone into very rapid adoption across a lot of our customers. Even today, a lot of our customers use AI and scale really fast. I think a brilliant example is one of our big retail customers in the UK, Ocado. They use AI and our cobots as the backend of their fulfillment centers. On their big matrix storage systems, they simply mounted a lot of UR cobots on top of the aluminum infrastructure, and these robots are picking into the bins. They are using AI to detect all the different products they need to manage, and then picking, sorting, and preparing the e-commerce orders inside the warehousing systems.
In the beginning, when they started scaling this, the AI systems could manage and detect 15% of all the SKUs. Today they’re adding products really fast, and they expect that within the next year and a half they can manage 75% of all the products, so this fully automated warehousing system even gets picked, sorted, and organized inside using AI and robotics.
So I think it’s amazing, the new opportunities it can open, because it no longer requires this bespoke and complicated programming. It’s all about learning and adding models to the system instead.
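To make the detect-and-verify loop Anders describes more concrete, here is a minimal Python sketch of the pattern: detect, check confidence, move a small step, re-detect. The camera, detector, and drive interfaces below are hypothetical stand-ins stubbed so the sketch runs; this is not MiR’s or Universal Robots’ actual software.

```python
"""Minimal sketch of an AI detect-verify-act loop for pallet docking.

All hardware and model interfaces are hypothetical stand-ins, stubbed so
the sketch runs; it illustrates the pattern, not any vendor's product.
"""
import random
import time

# Simulated world state: stand-in for the true distance to the pallet.
_true_distance = 2.0

def capture_frame():
    """Stand-in for a camera driver; returns a fake frame record."""
    return {"timestamp": time.time()}

def detect_pallet(frame):
    """Stand-in for a learned detector that is robust to shrink wrap,
    stickers, damage, etc. Returns an estimated distance and confidence."""
    return {"distance": _true_distance + random.uniform(-0.02, 0.02),
            "confidence": random.uniform(0.85, 0.99)}

def drive_step(step=0.1):
    """Stand-in for the motion controller: advance one small step."""
    global _true_distance
    _true_distance = max(0.0, _true_distance - step)

def dock_to_pallet(min_confidence=0.8, reach=0.15):
    """Approach in small steps, re-detecting every cycle so the robot can
    react if the pallet looks different or has moved."""
    while True:
        det = detect_pallet(capture_frame())
        if det["confidence"] < min_confidence:
            print("Detection uncertain, re-scanning")
            continue
        if det["distance"] <= reach:
            print("Docked to pallet")
            return
        drive_step()
        print(f"Approaching, ~{det['distance']:.2f} m to go")

if __name__ == "__main__":
    dock_to_pallet()
```

The key design point in Anders’ description is that detection is not a one-shot step before motion: it runs continuously during the approach so the system can react in real time if anything changes.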
Brian: The innovation seems to have accelerated at a breathtaking pace. It seems like just a couple of days ago robots were restricted to automotive manufacturing factories, and now, as you’re pointing out, they’re being deployed everywhere. You mentioned AI, you mentioned the Universal Robots AI platform. Talk a little bit more about how low- or no-code AI platforms are speeding up this acceleration, putting more power in the hands of more innovators, faster.
Anders: Absolutely, you’re completely right. The use of no-code and generative AI in robot programming is picking up a lot of speed nowadays. And I think we can draw a lot of parallels to the way AI tools have been adopted elsewhere; we can all relate to how the generative AI tools arrived with GPT and so on over the last couple of years.
In the beginning, we started to get AI in as better tools. We learned that AI was a very powerful tool on the computer vision side: give the robots eyes, give them the ease of recognizing new objects and learning how to react. That was the first milestone we started seeing a few years back, and it started picking up speed.
What’s going on, especially in the frontier labs today, is really using generative AI to also generate robot behaviors. These are the vision-language-action models, which combine language models, the large language models we know from GPT, with vision, the eyes and sensors of the robot, and instead of outputting text and recipes, they actually output robot behaviors and actions.
And that is showing tremendous progress. It shows robots where you can really ask them, “Hey, pick up these two pencils and sort them by color.” Of course, a recipe for that is pretty easy to imagine; ChatGPT could generate that. But going from the eyes of the robot seeing the scene in the vision pictures, to distilling the plan of how to do it using language models, and then deriving an action sequence to make all of this happen, so that even at runtime, if you drop one of the pens, it goes “oh” and re-picks it up and so on, that’s quite amazing.
So we’ll see a lot of these things come over the next couple of years. A lot of it is still in its infancy, and there are still a lot of unknowns. Robotics, of course, has the challenge that, where the large language models like GPT had the luxury of the internet as a data source for data mining, so you could build really great models, we don’t have the same luxury within robotics. So there’s a lot of work going on, some of it work we’re doing with NVIDIA, and a lot of labs are working on data foundries, generating simulated robot behaviors and so on, simply to feed the beast with more data so we can understand everything the systems can do.
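To make the vision-language-action idea concrete, here is a toy Python sketch of the perceive-plan-act loop: an observation plus a natural-language instruction goes in, a sequence of robot actions comes out, and a real system would re-observe and re-plan on failure. The “model” and robot interfaces are hypothetical stubs, not any specific VLA system.

```python
"""Toy sketch of a vision-language-action (VLA) style control loop.

The planner and robot interfaces are hypothetical stubs so the example
runs; a real VLA model maps camera images plus a text prompt to learned
action outputs rather than hand-written rules.
"""
from dataclasses import dataclass

@dataclass
class Action:
    name: str    # e.g. "pick", "place"
    target: str  # object the action applies to

def observe_scene():
    """Stand-in for the robot's cameras: returns detected objects."""
    return ["blue pencil", "red pencil", "blue tray", "red tray"]

def vla_plan(instruction, scene):
    """Stand-in for a VLA model: turns an instruction and an observation
    into a sequence of actions. Hand-written here for one example task,
    so the instruction text is not actually interpreted."""
    actions = []
    for obj in scene:
        if "pencil" in obj:
            color = obj.split()[0]
            actions.append(Action("pick", obj))
            actions.append(Action("place", f"{color} tray"))
    return actions

def execute(action):
    """Stand-in for dispatching the action to the robot controller."""
    print(f"executing: {action.name} -> {action.target}")
    return True  # a real controller would report success or failure

if __name__ == "__main__":
    instruction = "Pick up the pencils and sort them by color."
    for action in vla_plan(instruction, observe_scene()):
        if not execute(action):
            # A real system would re-observe the scene and re-plan here,
            # e.g. after a dropped pencil.
            break
```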
Brian: Everybody’s concerned about the effect that automation, and robotics in particular, is going to have on the workforce, even though we have labor shortages in specific spots all around the world at any given time. Talk a little bit about the role of the, well, cobot is probably not the right word, but the robot that is your coworker.
Anders: Absolutely, because you’re hitting the nail on the head with this question. Today the discussion should not be, and in most places in the world, thankfully, it is not, about whether we should use robots and what effect that has on the workforce. There’s nowhere in the world today where we will not run into huge labor issues within the coming 10 years. The demographic shift, an aging population, is happening so fast, and the steepest decline is going to happen over the next 10 years: a massive decline in the 25-to-69-year-olds who can actually do work in factories. We see it in a lot of the Asian countries, we see it in Europe; we have not been hit as hard in the US yet, but that’s coming up as well.
So one thing is the short-term labor shortages we’ve seen, especially in some industrial sectors, but we will have a more systematic challenge all the way across the world, largely in this next decade to come. On top of that, working within manufacturing and logistics are the two lowest-ranking jobs all around when you ask young people. So it will be continuously difficult to find young people who actually, ambitiously want to go into manufacturing and build their careers there. So we need to think about manufacturing, about logistics, and about all this manual work a lot differently than we have historically.
Thinking about that, it will mean that today, if you are trained as a welder, and metalworking is absolutely important for our future, we need to keep doing it, but if you’re trained as a manual welder, your job might change a hundred percent over the next decade. Today, you should be trained as a robot welder instead.
And welding with robots is really easy. There’s great software today; if you work with some of our partners, they have welding systems you can buy that can be delivered within a week. You can go through a morning of training, and in the afternoon you would personally be welding with a robot instead of your manual welding torch, which allows you to program the welds instead of doing them manually. It allows you to do many more batches with the same quality, the same repeatability, and so on.
And I think those changes will happen to a lot of the work we see across the world today, right? Because we will need robots for all of this work. And as a manufacturer, you need to accept that working with robots is going to be a skillset that needs to be built into the skillsets of the factory, right? It’s not a thing you buy from somebody else with a red and a green button that you turn on and off and it will produce your products for a decade. That will not be how the world of future robotics will be. It’ll be something you need to master as a manufacturing skillset, if you will, no matter whether you’re small or large as a company.
Brian: Surgeons in the healthcare industry certainly have embraced robotics, especially for trickier, more intimate kinds of procedures, and it’s working out great.
Anders: Yeah, it’s a very good example, right? You see that robotics enhances the capabilities of a surgeon: you can do a big movement with your hand and the robot can translate that to a micro-movement inside the person. Or you can have the safeguards, you can have a much more sterile environment. There are so many things you can do with the combination of human intellect and capabilities and the precision and performance of a robot.
Brian: So you’ve said traditional industrial robotics has plateaued, in a way. Why do you say that? And what technologies are enabling the shift toward a high-mix, low-volume approach?
Anders: The way we’ve used traditional industrial automation has been, as we talked about in the beginning, largely as a technology that applies to volume manufacturing. It means that as long as you have production you can rely on for a long period of time, then investing in bespoke automation, where things are mechanically adjusted and built to size, every cable is cut to exact length, and manufacturing lines are built, is quite efficient, if you have the same product in really high volumes. But even there, we know that time to market is critical, and these designs are costly. So even there we are seeing demand for something faster and more flexible, but especially at higher mix: high-mix low-volume, high-mix medium-volume.
And we see that demand even for most modern products, right? There’s a need for personalization. Maybe people want to order products in different colors, in different configurations; if you think of any modern car today, there are millions of different configurations as soon as you’re done with it. And that actually applies to quite a lot of products. IT products, laptops built on manufacturing lines, come in different configurations and so on. So even for the laptop manufacturers it’s actually difficult to automate these kinds of workflows.
For all of this, we need something much more flexible that can manage this, and cobots are a perfect technology for it. They are very easy to work with through software, and it is very easy to apply these different manufacturing processes. The same goes for the AMRs in logistics workflows: everything can be order-based, and it can be reconfigured on the go.
It can also be thought of differently, because it’s actually possible to reprogram and recommission the robots. If you have one type of production running for a couple of years and you decide you want to shift your manufacturing to another, better product, you don’t need to throw the whole thing away. With a bespoke traditional manufacturing line, that’s what you needed to do: either spend a lot of money rebuilding the whole thing, which is often too costly, or, more often, start all over. These new modern robots, you can just reprogram.
And especially with the introduction of AI, where it’s all about adding new models, adding new check steps, it really makes the switch toward high-mix, low-volume a lot easier. So we are quite excited, because we are really seeing this trend, and especially over the last year and a half, the desire to jump into this among large and small manufacturers has really skyrocketed. So it’s very exciting times for robotics.
Brian: Yeah, at least in Silicon Valley you almost can’t turn a corner without bumping into a garage where there’s a robot prototype being built. Let’s talk about one of my favorite topics, which is Arm technology and what it’s doing out there. Talk about how Universal Robots, and Teradyne in general, have adopted Arm technology and why. What are you doing with it?
Anders: Absolutely, because Arm technologies have been a central pillar of our product innovation for the two decades we have existed. The unique thing about advanced robotics is that it is really computationally driven all the way from the bottom up. Our robots have always been designed with distributed computing, with Arm processors and communications infrastructure spread all the way around the robot, because having a modern and efficient robot requires intelligence in every corner of the robot. It goes from the servo drives driving the motors to the communication all the way across the infrastructure.
And of course, now it really is all about AI. AI means a lot to us in a lot of different instances. We are using AI on the edge. We are looking very much into AI even in the smallest parts, like using AI to optimize how we drive motors, how we compensate for physical phenomena, how we react to touch and feel and force all the way across, how we calibrate the robot. So we have a continuous demand for on-the-edge, inside-the-device AI and high computing all the time.
And of course, coming back to the larger models, the more performant AI: we’re really excited about this merger we can see between AI processing and conventional computing, because it allows us to go with conventional computing infrastructures and add the AI-accelerated workflows, the reasoning engines, the reinforcement-learning-based models, which we can then integrate very tightly into our robot systems. And Arm has really been a key driving technology for us in making all of that happen.
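As a toy illustration of “AI in every corner of the robot,” here is a sketch of a conventional joint controller with a small learned residual added, the kind of friction or physics compensation Anders mentions. The model here is a hand-tuned stand-in, and real compensation of this kind runs in joint firmware at kilohertz rates on embedded (e.g. Arm-based) controllers, not in Python.

```python
"""Sketch: conventional PD joint control plus a learned residual term.

'learned_friction_model' is a hand-tuned stand-in for a small learned
model; this illustrates the idea only and is not UR's control code.
"""

def pd_torque(position_error, velocity_error, kp=50.0, kd=2.0):
    """Conventional proportional-derivative feedback term."""
    return kp * position_error + kd * velocity_error

def learned_friction_model(velocity):
    """Stand-in for a tiny learned model predicting friction torque from
    joint velocity (here a Coulomb + viscous approximation)."""
    coulomb, viscous = 0.8, 0.05
    sign = (velocity > 0) - (velocity < 0)
    return coulomb * sign + viscous * velocity

def joint_torque_command(target_pos, target_vel, meas_pos, meas_vel):
    """Feedback plus learned compensation, evaluated once per control tick."""
    feedback = pd_torque(target_pos - meas_pos, target_vel - meas_vel)
    compensation = learned_friction_model(meas_vel)
    return feedback + compensation

if __name__ == "__main__":
    # One control tick: slightly behind the target and moving at 0.3 rad/s.
    tau = joint_torque_command(1.00, 0.0, 0.98, 0.3)
    print(f"commanded torque: {tau:.3f} Nm")
```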
Brian: One of the,
Anders: yeah,
Brian: one of the cool things you’ve done is you’ve integrated the NVIDIA Jetson AGX Orin, which is based on Arm, into some of your systems. Why is that architecture compelling for your use cases?
Anders: The NVIDIA Jetson Orin, and the Jetson family in general, is very compelling for us because it gives us this seamless integration of very high-performance AI computing together with a computing infrastructure that works well with our use cases.
An important thing about the Universal Robots product is that it’s a platform product. We run our own software on our robots, but even more importantly, our robot is a host for a lot of innovations created by our partner ecosystem. Today we have a partner ecosystem of more than 380 companies that have designed grippers for our robots, vision systems for our robots, advanced software for our robots, all sorts of mechanical contraptions like cable guidance and paint covers. We have a huge ecosystem of partners that have innovated a lot, and we allow them to run and deploy software innovations, often together with their hardware innovations, on our robots. Having an AI infrastructure that allows us to take these software innovations from our partners, not only onto the traditional conventional computing but actually adding AI computation on top of that, has been extremely exciting for us.
And I think Orin is a perfect platform for that, as it seamlessly integrates computational workflows from traditional conventional computing into AI and has a very mature and well-adopted ecosystem around the infrastructure as well. It’s been very well received across our entire partner ecosystem. Our internal developers, of course, love it, because we can have such efficient development workflows. And we really love the way these modern Arm architectures allow us to leverage some of the highly embedded, highly performant capabilities of the computing platform as well as the more conventional computing architectures, which gives us a much faster time to market. It gives our customers and partners a much faster time to market, and it also provides them a reliable, field-proven platform that can be deployed in factories at scale.
That has often been the bottleneck of a lot of these advanced innovations: the technology innovations were nice, but as soon as they had to be deployed in factories, it was difficult to get them out of the garage and figure out what industrially hardened hardware platform could actually carry the robotic innovations into the factories. Now, with the NVIDIA technology integrated tightly into the UR platform, we really can deliver that, thanks to the Arm architectures as well.
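As a rough sketch of how a partner-supplied vision “skill” might run as AI-accelerated inference alongside conventional code on an Arm-based module such as Jetson Orin, here is a generic ONNX Runtime example. The model filename, input shape, and fallback behavior are placeholder assumptions; this shows a common pattern, not the UR platform’s actual APIs.

```python
"""Sketch: running a partner-supplied vision model on an Arm-based edge
module (e.g. Jetson Orin) with ONNX Runtime, falling back to CPU.

'partner_gripper_detector.onnx' and the preprocessing details are
placeholders; this is the general pattern, not UR's platform APIs.
"""
import numpy as np
import onnxruntime as ort

def load_session(model_path="partner_gripper_detector.onnx"):
    # Prefer the GPU/accelerator execution provider when available, else CPU.
    providers = ["CUDAExecutionProvider", "CPUExecutionProvider"]
    return ort.InferenceSession(model_path, providers=providers)

def preprocess(frame):
    """Normalize a camera frame into an NCHW float tensor."""
    img = frame.astype(np.float32) / 255.0
    return img.transpose(2, 0, 1)[np.newaxis, ...]  # (1, 3, H, W)

def infer(session, frame):
    input_name = session.get_inputs()[0].name
    outputs = session.run(None, {input_name: preprocess(frame)})
    return outputs[0]  # e.g. detection boxes or grasp scores

if __name__ == "__main__":
    session = load_session()
    fake_frame = np.zeros((480, 640, 3), dtype=np.uint8)  # stand-in camera image
    print(infer(session, fake_frame).shape)
```

The design point is the one Anders makes: the same code path serves conventional CPU execution and accelerated AI execution, so a partner skill can be developed once and deployed on the industrially hardened platform.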
Brian: Terrible pun alert coming your way. You were a competitive swimmer, a championship swimmer, right? What do you see as the next wave of robotics innovation? We talked earlier about not looking so much to the future because it’s happening right now, but there’s going to be a very cool future and you’re inventing it right now. Talk a little bit about that.
Anders: Yeah, there’s absolutely a wave coming, so I like that, and thankfully I’ve prepared over the years to swim in it. I think physical AI is really, for me, the main driver. When I say physical AI, it’s really the next generation of generative AI. Think about the LLMs we see in so many other spaces; that is now starting to deploy and spread to the robot platforms, and it will empower robot platforms of any shape, form, or size.
Thankfully for us, it’s a perfect fit for the AMR platforms and the collaborative robot platforms, and it will supercharge the capabilities of those platforms; they’re designed for exactly this purpose. Today we have the world’s largest deployment into the university labs, because they love our platform, and we are seeing it really being adopted in a lot of the startups. So that really excites us.
The innovation here is moving super fast. There’s still an innovation curve, but we will see it coming. We’ll see it solve things that, in the beginning, are simple, but very rapidly we will get maturity, predictability, and so on. So we will reach these low-code, low-programming scenarios for high-mix, low-volume, where you’re largely programming your robots by prompting, by having dialogues: “No, don’t do that, do this instead.” And of course, industrial users do need to understand what’s going on. They need the robot to somehow present, “this is what I’m planning to do,” and it should preferably do the exact same thing all the time, or at least, if it reacts to changes, it needs to be a little bit predictable. Those are all problems we are working our way through, but I absolutely believe this is coming downstream. I think we have a tsunami of innovation coming.
The investments we see in AI across the world, these models, are breaking down so many barriers and walls. If we think a decade back, it’s actually been a little bit of an innovation drought within robotics. We had the collaborative robots come in; we tried a little bit with AI a decade or two ago, and it didn’t really move the needle a lot. Now we really see these big breakthroughs, so we are just on the cusp of something that’s going to be really exciting for more advanced robotics.
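A tiny sketch of the “program by prompting, but show me the plan first” workflow Anders describes: a natural-language instruction becomes a reviewable step list, the operator approves it, and only then do the steps run. The planner and robot calls are hypothetical stubs, not a real product interface.

```python
"""Sketch: prompt-based robot programming with operator confirmation.

The planner and robot interface are hypothetical stubs; the point is the
workflow: present the plan, get approval, then execute the same steps
deterministically.
"""

def plan_from_prompt(prompt):
    """Stand-in for a language-model planner that turns a prompt into a
    fixed, reviewable list of robot steps (hard-coded here)."""
    return [
        "move to bin A",
        "detect part 'bracket'",
        "pick 'bracket'",
        "move to fixture",
        "place 'bracket' in fixture",
    ]

def execute_step(step):
    """Stand-in for dispatching one step to the robot controller."""
    print(f"[robot] {step}")

def run(prompt):
    plan = plan_from_prompt(prompt)
    print("Proposed plan:")
    for i, step in enumerate(plan, 1):
        print(f"  {i}. {step}")
    if input("Approve plan? [y/N] ").strip().lower() != "y":
        print("Plan rejected; nothing executed.")
        return
    for step in plan:
        execute_step(step)

if __name__ == "__main__":
    run("Pick the brackets from bin A and load them into the fixture.")
```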
Brian: It is dizzying in a lot of ways, but an exciting dizzying. So, going forward, how do you see advancements in Arm’s edge AI roadmap supporting your vision?
Anders: I think the focus of the Arm edge AI roadmap is exactly what we need. For the more advanced AI models, we need significantly more compute than we have today; I think the enormous investments in AI data centers show the amount of capacity that the modern models need.
We do see that, at least for the edge AI models, there’s also work ongoing to make them more efficient, so they will be able to do more with less. But there’s no doubt we will absolutely need these very nicely integrated AI architectures that go seamlessly from conventional compute to AI compute and allow data sharing without the historical bottlenecks that more discrete systems have had over the years.
So I really foresee having better AI even in the smaller chips, so we can apply AI from motor control to small signal processing all the way at the very tip of the spear, having stronger, more capable AI processing engines in the edge devices, because there’s no way any robot can do all of its core processing in a cloud service, right? It needs to happen on the edge, in real time, predictable and reliable. And that’s exactly what the advanced Arm processors can give us.
So I’m very excited to see the wave of improvement in performance and capacity, but also improvement in integration and the specialty functions that need to be accessible. We also need faster development flows for us as robot innovators, right? We need things with really good SDKs, really good compatibility, fast time to market, so we don’t need to specialize in very quirky instruction sets and so on, and those are really good trends happening all the way across the chip sector right now.
Brian: I know we’re bumping up on time, but I want to ask you two last questions real quick. Humanoid robots: I want one. However, from your perspective, what milestones do we need to hit to get that technology to a point where they can play a serious role in my life or the factory’s life?
Anders: Oh, yeah, and I fully understand you want one; the vision is so compelling.
First of all, I think it’s important to distinguish between all the advancements happening within physical AI and the human form factor. They are two different things, and the innovations within physical AI will really impact every robot type: whether it’s a dog-shaped robot, a robot arm, or a mobile robot, all of those will benefit. The human form factor is a great generic platform, right? Humans have proven over centuries that it is extremely versatile. So in the future, of course, having a very functional, well-functioning human form is going to be great, especially where high degrees of flexibility are needed. But I think we’ve also learned that if you’re largely doing the same thing, there are far more efficient form factors of machinery than humans; that’s why we build machines, to increase the effectiveness of humans even further. So I think there’s going to be a great role for other types of robots, the collaborative robots, the mobile robots and so on, for those kinds of things. And then when you do need the extreme flexibility, and that’s a lot of versatile jobs, it’s probably working in your home, it’s all of those things, the humanoid form factor is a great form factor.
Some of the practical milestones that still need to come: we’re still in the early phase of understanding the safety impact of these machines walking around. A report from the Humanoid Safety Working Group came out just last week, which is the first milestone in how we even approach building safety standards for all of these. Then there’s dexterity: the fingers, the grippers, all of that is still a weak point. I used to say a lot that the bottleneck for large-scale adoption of robots for these very flexible, dexterous tasks is really mechatronics. It’s nice to have the human arms, right? But they largely do what robots have done for a long time. The tricky part is how we control these fingers, what the right configurations are, and how we do that reliably. Fingers are quite delicate devices, and maybe we don’t need five fingers with the same degrees of freedom as humans, but there’s no doubt there’s an innovation bottleneck still there.
So there’s no doubt there’s a long way to go for humanoid robots. But I’m quite excited seeing both the amount of investment and the innovation that the companies around humanoid robots are driving. It’s quite exciting, and it will pave the way for so many things, also in the short term, in terms of how to solve things and so on. So I’m quite excited to follow this; it will take some time until we really have scaled adoption and have worked our way through all these issues.
Brian: Last question, just riffing on that: what do you see as the single biggest barrier to broader robotics adoption right now? Is it cost? Is it practicality? Is it cultural?
Anders: Yeah, it’s a difficult problem to put in one sentence, but if I should really cut it to the bone, the problem is that we have by no means enough robot experts in the world for what we need, because today, to deploy robots at speed, we need an expert.
And it’s going to be the challenge for all of us to solve that problem with technology. We need to democratize access to robotics by making robots much faster to program, to deploy, to get to work. Not only getting a robot to do something fast; it needs to do it reliably, with the level of performance that actually makes it productive. It needs to be flexible around all the things, and it needs to be easy to use for the people. That will also drive the cost down, right? Because today, buying the hardware is one thing, but a lot of the cost is also driven by just the effort it takes to integrate, apply, and deploy the robots, whether from the services they acquire or from the owners of the robots themselves.
If we can solve this with technology, it’s going to dramatically decrease the adoption costs of robots and still deliver on all the key performance metrics. So it’s going to be up to us to really take the capacity of the robotics experts and put it into technology, and allow that to really break down the adoption barriers of robots.
Brian: Anders, I can’t remember a more interesting 37 minutes or so that I’ve spent talking to somebody and you’re in an amazing spot going forward. So best of luck to you and your team, and thanks so much for joining us.
Anders: You’re welcome. And thank you for having me.






