New C2PA Standards Group to Fight Disinformation
A photograph of a ship, heavily iced over in a marina, circulates widely on social media captioned “USS Al Gore Global Warming Research Vessel.” A national politician, caught on camera at a press conference, appears to be drunk. Former President Barack Obama appears on camera talking in part about his successor, Donald Trump, in uncharacteristic, profane terms.
All of these pieces of content appear genuine at first glance: genuine enough to go viral. Yet the United States has no vessel named after the former vice president, and the video of the “drunk” politician was edited to make her seem inebriated.
And in the case of Obama, actor Jordan Peele appears in split screen halfway through the video, speaking the same words in his own voice. The clip is a deepfake made with Peele’s participation, and it leaves viewers with this message: “Moving forward, we need to be more vigilant with what we trust from the Internet. It’s a time when we need to rely on trusted news sources.”
Inauthentic content harms public affairs, corporate business decisions, elections, and even people’s health
In a world of increasingly accessible and powerful digital technologies, the rise of disinformation and misinformation in various forms is alarming. Content can be deemed inauthentic if it is deliberately misleading, fabricated or manipulated, or if it comes from apparently genuine sources that have in fact been impersonated. In the wrong hands, inauthentic content harms public affairs, corporate business decisions, elections, and even people’s health.
The number of content moderators on social media platforms and of fact-checking organizations has exploded in recent years, yet our collective ability to keep up with disinformation is stretched to its limits.
But while technology has enabled the rise of misinformation and disinformation in the digital age, it can and must also help to minimize their spread and harmful effects. Responsible technology, built on ethics, safety and security, can enable sustainable impact in a connected world.
Arm has been at the forefront of defining many industry standards through the years, and we understand that content authentication starts at the silicon level.
Arm co-founds new cross-industry coalition to fight misinformation and disinformation
That’s why we’re excited about Arm co-founding the newly formed Coalition for Content Provenance and Authenticity (C2PA). Arm joins Adobe, BBC, Intel, Microsoft and Truepic in forming this cross-industry coalition. Together, we’ll address issues of misinformation and disinformation and work to establish guidelines and technical solutions toward that goal. This includes developing an end-to-end open standard for tracing the origin and evolution of digital content and ensuring an accurate record of any changes made to original content.
C2PA will work closely with organizations such as Project Origin and the Content Authenticity Initiative (CAI), which focus on implementation of content provenance standards and technologies. The Adobe-led CAI focuses on media capture, editing tools and the creative community to ensure content coming from any source can have provenance at its core.
Project Origin focuses on establishing and maintaining the provenance of content from trusted points of origin, including authentication of content through its travel and transformation through the news publishing ecosystem. Origin will champion the adoption of interoperable workflows between publishers.
Recognizing this common goal, the organizations formed the C2PA in 2021 to unify technical specification efforts under a single entity. The C2PA strives for a technology standard to be widely adopted across the web, client devices, and more broadly, anywhere people create or consume content, from professional cameras to newsfeeds on smartphones.
Our role in ensuring standards apply wherever Arm technology is used
Arm will provide its perspective from the bottom of the hardware stack (the point of capture, if you will) to ensure global standards are inclusive and adequate. We will look to ensure that these standards will apply wherever Arm hardware is deployed and add more specific technical expertise around security needs at the silicon level.
We’re attempting to minimize the flow of misinformation by giving anyone who publishes or accesses media via the internet the ability to demonstrate that it a) comes from where it says it has come from and b) is in the state the publisher intended.
It’s important to be clear that we are not making judgements on the relative reliability of the content, journalist or publisher: others are working in this space. The standards that emerge will ensure the cryptographic integrity of claims and assertions about provenance and authenticity, and provide the means to embed those claims and metadata into the content items themselves.
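To make that concrete, here is a minimal sketch of the general idea: a publisher binds its identity to a hash of the content in a signed claim, and anyone can later verify both the signature and that the content is still in the state the publisher signed. This is an illustration only, not the C2PA manifest format; the field names, JSON encoding and Ed25519 signing scheme are assumptions chosen for brevity.

```python
# Illustrative sketch of a provenance claim: hash the content, sign the claim,
# then verify the signature and that the content is unaltered.
# NOT the C2PA specification; claim fields and signing scheme are assumptions.
import hashlib
import json

from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)


def make_claim(content: bytes, publisher: str, key: Ed25519PrivateKey) -> dict:
    """Bind the publisher's identity to a hash of the content and sign it."""
    claim = {
        "publisher": publisher,  # who the content claims to come from
        "content_sha256": hashlib.sha256(content).hexdigest(),
    }
    payload = json.dumps(claim, sort_keys=True).encode()
    return {"claim": claim, "signature": key.sign(payload).hex()}


def verify_claim(content: bytes, signed: dict, pub: Ed25519PublicKey) -> bool:
    """Check the signature and that the content matches the signed hash."""
    payload = json.dumps(signed["claim"], sort_keys=True).encode()
    try:
        pub.verify(bytes.fromhex(signed["signature"]), payload)  # raises on mismatch
    except Exception:
        return False
    return hashlib.sha256(content).hexdigest() == signed["claim"]["content_sha256"]


if __name__ == "__main__":
    key = Ed25519PrivateKey.generate()
    photo = b"...image bytes..."
    signed = make_claim(photo, "example-news.org", key)
    print(verify_claim(photo, signed, key.public_key()))              # True
    print(verify_claim(photo + b"edit", signed, key.public_key()))    # False: content altered
```

In a real deployment the signed claim would travel embedded inside the media file itself and the signing key would chain to a trusted certificate, but the underlying principle of verifying the hash and the signature is the same.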
The consequences of inaction
We’ve all chuckled at some early examples of fake content, whether it’s the Obama segment or this clever deepfake of actor-impressionist Bill Hader’s face morphing into Arnold Schwarzenegger as he impersonates the actor on a late-night talk show.
But we’ve also read stories whose content doesn’t match their headlines, or that appear to come from a reputable source but actually don’t.
With each chuckle, however, comes some anxiety. We see how easy it is to spread fake content and mislead millions of people; we understand that there are serious societal consequences to this abuse.
We’re excited about our participation and encouraged by the broad industry activity in this area, from the C2PA to the CAI, Project Origin and others.
Join the Coalition for Content Provenance and Authenticity (C2PA)
We invite Arm ecosystem partners and anyone concerned about these issues to join us in the fight against the dangerously destabilizing forces of misinformation and disinformation.
Please email membership@c2pa.org or c2pa-interest@arm.com for more information.