Moore’s Law has long acted as a roadmap for the tech industry. What started as a prediction – that the number of transistors on a chip would double approximately every two years while the relative cost decreased – became a blueprint for progress, a self-fulfilling prophecy that has seen processors shrink from 2,250 transistors in an area of 12 mm² to current designs with more than a hundred million transistors per square millimeter.
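As a back-of-envelope check on that scale, a short calculation using only the figures quoted above shows just how many doublings that journey represents:

```python
from math import log2

# Figures from the text: the first commercial microprocessor packed
# 2,250 transistors into 12 mm²; modern designs exceed 100 million per mm².
early_density = 2250 / 12    # ~187.5 transistors per mm²
modern_density = 100e6       # transistors per mm²

growth = modern_density / early_density
doublings = log2(growth)

print(f"density grew ~{growth:,.0f}x, about {doublings:.0f} doublings")
```

Roughly half a million times denser – about 19 doublings over five decades, remarkably close to the cadence Moore predicted.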
These advancements have transformed the user experience, fuelling increasingly sharp digital images, high-fidelity gaming and ever-more accurate speech and image recognition as computers and mobile devices become more powerful and more streamlined with every iteration.
But, as I wrote back in 2019, Moore’s Law is nearing its end. The rate of progress is slowing – transistors are getting so small that there are just a few dozen atoms along their gates, and the structure of individual grains of polycrystalline copper is a key consideration in signal timing.
So, what comes next?
The impact of climate change on technology development
When Moore’s Law was originally conceived, computing was relatively new and exciting. Despite scientist Eunice Newton Foote having first demonstrated the greenhouse effect as far back as 1856, human-influenced climate change was a possibility that was only just beginning to be considered – and that was partly because the computers needed to make the calculations were just becoming available.
Any correlation between the climate and the energy consumed by computers was not really on anyone’s radar – mostly because while computers were energy-hungry beasts, there were relatively few of them about.
Fast forward to the 21st century, where connectivity has become a basic necessity, essential for everything from medical information to grocery shopping. Internet access is a byword for opportunity: information is shared online, learning happens online, jobs are advertised online… essentially, if you’re not online, you’re missing out.
The correlation between digital exclusion and social exclusion is well established, yet 3.7 billion people worldwide still do not have full access to digital technology. Closing the digital divide is a moral imperative, but it also poses a new conundrum for the tech industry: how do we mitigate the environmental impact of 3.7 billion new digital consumers, connecting everyone, everywhere without catastrophically accelerating climate change?
It’s clear that the tech roadmap can no longer focus on raw processing power alone. Squeezing more performance out of the same chips remains a top priority, but performance per watt is the metric that matters. And it’s more than just watts – it’s also energy, the power consumed over time.
Both are important but in most situations, one will be more important than the other. Datacenters as a whole are limited by their ability to dissipate heat energy, while individual servers have power limitations. Mobile devices are limited by the energy stored in their batteries, while their instantaneous power is limited by thermal constraints.
A sensor running off a solar cell can tap into vast amounts of available energy, but can typically generate only a small amount of power. Power, energy, and heat constraints limit all our computing devices – our goal at a product level is to pack more performance per watt into those limits, while globally it’s to rein those limits in.
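The distinction between power and energy constraints can be made concrete with a toy comparison (illustrative numbers, not from the text): two devices that consume the same daily energy but have very different peak-power demands.

```python
# Two hypothetical devices with identical daily energy budgets.
burst_power_w = 10.0     # computes in 10 W bursts...
burst_hours = 0.24       # ...for roughly 14 minutes a day
steady_power_w = 0.1     # vs. drawing a steady 100 mW all day
steady_hours = 24.0

burst_energy_wh = burst_power_w * burst_hours      # 2.4 Wh
steady_energy_wh = steady_power_w * steady_hours   # 2.4 Wh

# Same battery capacity required (an energy constraint), but only the
# bursty device needs thermal headroom for 10 W peaks (a power constraint).
print(burst_energy_wh, steady_energy_wh)  # 2.4 2.4
```

A battery or solar harvester cares about the 2.4 Wh; a heatsink or power rail cares about the 10 W.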
Koomey’s Law: a study in computational efficiency
In some ways, this isn’t a new concept. Koomey’s Law, coined in 2010 and named for Stanford professor Jonathan Koomey, describes a trend in the number of computations per joule of dissipated energy. This number doubled every 18 months from 1945 to 2000 (100x per decade), then slowed, doubling every 2.6 years or so since (16x per decade).
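The per-decade figures follow directly from the doubling periods – a quick sketch makes the arithmetic explicit:

```python
def per_decade(doubling_years: float) -> float:
    """Improvement factor per decade implied by a given doubling period."""
    return 2 ** (10 / doubling_years)

print(f"{per_decade(1.5):.0f}x")  # 18-month doubling: ~102x per decade
print(f"{per_decade(2.6):.0f}x")  # 2.6-year doubling: ~14x per decade
```

A 2.6-year doubling works out to roughly 14x per decade, in the same ballpark as the ~16x figure commonly quoted alongside Koomey’s Law.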
Koomey’s Law reflects the prevailing metric of computing power – peak efficiency initially then, more recently, typical use efficiency – and also the relatively static nature of power budgets. We expect our devices to have a certain power profile, even as Moore’s Law increases their compute capability.
That inflection point 20 years ago is most likely a consequence of the end of Dennard scaling around that time: chip operating voltages stopped shrinking at the same rate as transistor dimensions, so designers had to work for energy efficiency rather than getting it for free as a by-product of device scaling.
Koomey’s Law is arguably more relevant to the way consumers experience computing today – and the way we should be constructing tech roadmaps. Our digital life tends to span multiple devices, where battery life and performance per watt are more important than gross performance alone.
This is reflected in the priorities of manufacturers who are increasingly focusing their efforts on increasing efficiency. And a ruthless focus on efficiency is what’s required if we’re to reduce overall power consumption: the need to focus on decarbonizing compute has never been greater.
Performance per watt: a new paradigm
Moore’s Law and Koomey’s Law are not laws of nature but observations on technology direction, and we can use them to see where things might be headed. Extrapolating Koomey’s Law, for example, we would expect devices to continue to become increasingly power-efficient, with processors so low-powered that they could harvest energy from their environment.
To some extent, this is a technology that already exists. Passive RFID tags, for example, don’t use batteries; each tag harvests electromagnetic energy from the device used to read it. Typically, these tags are just information stores, but embedding a very low-power or even batteryless MCU capable of managing wireless data – as achieved by Arm’s Project Triffid – could give them the smarts they need to react to data on the fly, paving the way for billions of sustainable devices that bring intelligence without adding strain to the grid. (Battery-less tech also brings environmental benefits, since fewer batteries means less mining and less e-waste.)
The optimization of workloads to capitalize on ultra-low-power processing is also gaining traction through movements such as TinyML, which centers on the optimization of machine learning (ML) workloads, enabling them to run on just milliwatts of power.
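To put a milliwatt budget in perspective, a back-of-envelope sketch (illustrative numbers, not from the text – a typical CR2032 coin cell rating is assumed) shows how long such a workload could run on a single small battery:

```python
# Assumed CR2032 coin cell: ~225 mAh at 3 V.
cell_capacity_mah = 225
cell_voltage_v = 3.0
avg_power_w = 0.001  # 1 mW average draw for a TinyML-class workload

# mAh -> Ah -> Wh -> J, then divide by average power for runtime.
energy_j = cell_capacity_mah / 1000 * cell_voltage_v * 3600  # ~2,430 J
lifetime_days = energy_j / avg_power_w / 86400

print(f"{lifetime_days:.0f} days")  # ~28 days on one coin cell
```

Roughly a month of continuous inference from a coin cell – and duty-cycling the workload, or harvesting ambient energy, stretches that far further.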
And where demand goes, product will follow: at the end of last year, Arm launched the Arm Ethos-U65, a microNPU specifically designed to accelerate ML inference in embedded and Internet of Things (IoT) devices, delivering up to a 90 percent reduction in energy for ML workloads.
As we begin to close the digital divide, sharing the benefits of connectivity with billions of new users of technology, this relentless focus on efficiency will become ever more vital. If we’re to avoid catastrophic climate change, keeping power and energy numbers stable is not enough; we must work to ensure that they actively decrease, reducing energy consumption and lowering emissions wherever compute happens. Performance per watt must become the new paradigm, guiding product roadmaps that extract an increasing amount of performance from an ever-decreasing power envelope.
The tension between technology as a solution to our environmental problems and an exacerbating factor is not new, but it’s critical to the future of our planet that its net contribution is positive. Maximizing performance per watt is part of that – and it’s something that has been in Arm’s DNA since the beginning – but it’s also finding the places to tighten the power envelope of all our systems, computing equivalent results more efficiently. And it’s looking at what those systems do – could better compute capability make their overall function more efficient, the way a smart thermostat can lower gas consumption?
As a provider of foundational technology, Arm is uniquely placed to reduce the environmental impact of compute while maximizing its benefits. We can empower our ecosystem to drive up performance per watt and drive down emissions, helping technology stay on the right side of history as part of the climate solution and a more sustainable future.
Our Sustainability Vision
Connectivity cannot come at the expense of our planet. To minimize the environmental impact of our technology, we aspire to leverage our expertise in low-power compute to do more work per watt, providing a unique opportunity to drive up connectivity while driving down carbon emissions.