Arm Newsroom News

The New Technology Enabling Theresa May’s AI Healthcare Revolution


By Dennis Laudick, vice president marketing, machine learning group, Arm

A year ago, a global survey of 4,000 people by Arm and Northstar Research suggested the biggest benefit of artificial intelligence (AI) would be medical apps that enable earlier diagnosis of serious illnesses.

UK Prime Minister Theresa May this week agreed, calling AI a “new weapon” for medical research that could cut deaths from cancer by 10 per cent within 15 years. She said it might also help people enjoy an extra five years of healthy life by 2035.

AI was the main headline focus but the British Prime Minister’s announcement was a welcome endorsement for ‘Machine Learning’ (ML), a subset of AI which is the driving force behind many of the current advances in healthcare applications globally.

ML enables a computer system to learn and infer information from a bank of data using what is known as a training model. For example, a model might be built by showing a computer thousands of photographs of cats. Then, when the computer is shown a photo of a dog, it should have learned enough about what a cat looks like to raise its digital eyebrows.

Imagine this technique applied to a disease such as brain cancer. Show a computer a million MRI scans of different healthy brains, then start to feed it scans of brains containing a tumor; the machine will analyze each image, note the difference from its healthy brain ‘model’ and infer that this is something that should be red-flagged as inconsistent with a healthy brain scan.

The larger and more detailed the data set, the stronger the identification algorithm will be. A machine with tens of millions of images to learn from could potentially diagnose tumors with at least the same level of accuracy as a trained doctor.
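In spirit, the approach above can be sketched in a few lines of code. This is a hedged illustration only, not Arm's or any clinical implementation: simple feature vectors stand in for MRI scans, the "healthy model" is just their average, and the function names and threshold are hypothetical.

```python
# Toy sketch of the idea: learn a 'healthy' reference model from
# training examples, then red-flag inputs that deviate too far from it.
# Feature vectors stand in for MRI images; all names are illustrative.

def build_healthy_model(healthy_scans):
    """Average the healthy feature vectors into a reference 'model'."""
    n = len(healthy_scans)
    dims = len(healthy_scans[0])
    return [sum(scan[d] for scan in healthy_scans) / n for d in range(dims)]

def is_anomalous(scan, model, threshold=1.0):
    """Flag a scan whose distance from the healthy model exceeds a threshold."""
    dist = sum((a - b) ** 2 for a, b in zip(scan, model)) ** 0.5
    return dist > threshold

healthy = [[0.9, 1.1], [1.0, 1.0], [1.1, 0.9]]   # 'healthy brain' training set
model = build_healthy_model(healthy)

print(is_anomalous([1.0, 1.05], model))  # close to the model: not flagged
print(is_anomalous([3.0, 0.2], model))   # large deviation: red-flagged
```

A real system would learn far richer features from millions of images, which is exactly why the size and quality of the data set matters so much.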

A matter of record

The UK is arguably the perfect proving ground for this sort of technology as there is an overarching public healthcare body, the National Health Service (NHS). Each UK citizen has a unique NHS number and all their health data is matched to it. As the service becomes increasingly digitized, with data going directly into an individual’s record, the ability to track health factors over a lifetime increases. If third-party data can also be added, such as well-being and fitness, then the possibilities expand. There are, however, challenges that need to be addressed first.

Many current ML applications rely on a centralized AI training model. Voice assistants such as Amazon’s Alexa are a good example as they recognize someone is speaking but the actual human language processing is done remotely. This could present issues when applied to healthcare.

Devices that rely on a network or internet connection for ML heavy-lifting are not resilient to an internet connection outage. In addition, it makes them more vulnerable to cyberattack as they cannot be taken off-line and still function. If the devices are sharing identifiable patient data outside the healthcare system with a general cloud, there may also be privacy concerns. Furthermore, devices designed for use in the field need to be capable of full operation in remote areas where no internet connection or cell service is available.

Empowering the edge

As we talked about when we launched Arm Project Trillium earlier this year, the solution is to give devices the capability to run ML independent of the cloud by increasing their computational ability with dedicated ML and Object Detection processors. It’s what we call ‘ML at the edge’.

Machine learning can also be ‘federated’ – shared, potentially anonymously, with a training model in the cloud. This means new knowledge can be shared without also sharing the medical data used to learn it, negating privacy concerns.
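The federated idea can be sketched very simply: each site trains on its own data and shares only model weights, which the cloud averages. This is a minimal illustration of federated averaging in general, not a description of any specific Arm or NHS system; the local "training" step is a stand-in for real gradient descent, and all names are hypothetical.

```python
# Hedged sketch of federated learning: sites share weights, never data.

def local_update(weights, local_data, lr=0.1):
    """One local training step: nudge each weight toward the site's
    data average (a toy stand-in for real gradient descent)."""
    avg = [sum(col) / len(col) for col in zip(*local_data)]
    return [w + lr * (a - w) for w, a in zip(weights, avg)]

def federated_average(site_weights):
    """The cloud averages the sites' weights -- it never sees the
    underlying medical records."""
    n = len(site_weights)
    return [sum(ws) / n for ws in zip(*site_weights)]

global_model = [0.0, 0.0]
site_a_data = [[1.0, 2.0], [1.2, 1.8]]   # stays at site A
site_b_data = [[0.8, 2.2]]               # stays at site B

updated = [local_update(global_model, d) for d in (site_a_data, site_b_data)]
global_model = federated_average(updated)
print(global_model)
```

Only the two updated weight vectors cross the network; the patient records in `site_a_data` and `site_b_data` never leave their sites.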

Beyond the PM’s announcement, ML at the edge has broader healthcare applications that are already in use today. Arm-powered digital health solutions such as the Amiko Respiro inhaler utilize machine learning running on an Arm Cortex-M series microcontroller to monitor and analyze inhalation technique and frequency, and to enable support services. With more than 250,000 deaths attributed to asthma every year, it’s a great showcase for the power of edge-based ML to improve healthcare.

At the truly microscopic level, we’re even seeing devices capable of sequencing the human genome in real time without an internet connection, enabling potentially life-saving diagnoses of diseases such as Ebola to be made in the field. Biotech firm Oxford Nanopore’s latest device, the MinIT, employs six Arm processor cores to crunch through a strand of human DNA in under 24 hours. With a new Ebola outbreak now being reported in the Democratic Republic of Congo, it’s easy to see how lives would be saved by easier diagnosis of carriers.

As real-time devices such as Respiro and MinIT show, ML is already enabling a new wave of potentially life-saving devices outside academia. And as edge ML initiatives like Project Trillium enable ML processing at far lower cost and power than ever before, revolutionary new devices powered by ML are soon likely to find their way into every hospital, clinic and even home, becoming as commonplace a medical tool as the stethoscope.
