

The director of a major news organization asked me point blank what could be done after we reviewed several deepfake videos together at a meeting I had arranged with him to discuss the efforts being made by Microsoft’s Defending Democracy Program. The videos demonstrated how AI and graphics could be harnessed to generate persuasive, realistic renderings of political leaders saying things they had not said. Those words continued to echo as I returned home from the World Economic Forum in Davos in early 2019. How might we address the unprecedented challenges generated by the coupling of new forms of disinformation with viral sharing of content on the internet? What could be done to address the risk posed to journalism and democracy by synthetic and manipulated media?

There are no easy answers, but several promising ideas have come to the fore. One important direction in the fight against disinformation is to develop and field technologies for certifying the origin, authenticity and history of online media, which we refer to as the provenance of the content. I’m excited about the progress in this direction, nourished by strong cross-organization collaborations. Microsoft and the BBC have teamed up with Adobe, Arm, Intel and Truepic to create the Coalition for Content Provenance and Authenticity (C2PA). The C2PA is a standards-setting body that will develop an end-to-end open standard and technical specifications on content provenance and authentication. The standards will draw from two implementation efforts: Project Origin’s (Origin) work on provenance for news publishing and the Content Authenticity Initiative (CAI), which focuses on digital content attribution. Together, we are a small but growing coalition with a shared mission to re-establish trust in digital content via methods that authenticate the sources and trace the evolution of the information that we consume. This effort will require participation by global organizations with a desire to combat disinformation, consumers who want to regain trust in what they see and hear, and policymakers and lawmakers with the best interests of all of society as a top priority.

The formation of the C2PA comes via creative problem-solving at multiple organizations, with innovative efforts occurring independently and together. Shortly after my meetings in Davos, I sketched out a back-of-the-envelope solution to address media authentication and provenance. We’d need watermarking to tag content, combined with strong security and a means of storing and tracking allowable changes to content over time. I reached out to tap the expertise of long-term Microsoft Research colleagues: Henrique (Rico) Malvar, an expert in signal processing with a long history of contributions to rights management and compression technologies; Paul England, a security and privacy specialist who developed the Trusted Platform Module (TPM) technologies to encrypt devices; and Cédric Fournet and Manuel Costa, who led efforts on the Confidential Consortium Framework (CCF), an open-source framework for building a new category of secure, performant blockchain networks. I challenged the team with a question: Could we build an end-to-end pipeline that would authenticate the identity of the source of audiovisual content and assure, over the transmission and greater life history of that content, that the “photons hitting the light-sensitive surface of a camera would be properly represented by the pixels on displays viewed by consumers”? Could such a “glass-to-glass” system accurately assign a “pass” or “fail” to digital media, depending on whether content was modified beyond a set of acceptable changes associated with normal post-production and transmission?

[Early whiteboard captured as part of notetaking on the AMP effort.]

The early working sessions with the initial team members were just a start. We spent hours together drawing on a whiteboard, brainstorming ideas and performing attacks on potential solutions before we came up with a pipeline of technology and techniques that we had confidence in. Thanks to the efforts of other researchers and engineers, including Microsoft Research security expert Jay Stokes and Azure Media Security lead Andrew Jenks, we developed a solution we refer to as Authentication of Media via Provenance (AMP), a blueprint for authenticating the provenance of media content.
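To make the “glass-to-glass” idea a little more concrete, here is a minimal sketch in Python of the general pattern: hash the content when it is captured, sign a small provenance manifest, record each allowable edit, and return a pass or fail at display time. Everything in it is an illustrative assumption rather than the actual AMP or C2PA design: the manifest fields, the ALLOWED_EDITS set, and the HMAC used as a stand-in for the certificate-backed public-key signatures a real provenance system would rely on.

```python
# Hypothetical sketch of a provenance check: hash content at capture, sign a
# manifest, append allowable edits, and verify (pass/fail) at display time.
# This is a simplified illustration, not the AMP or C2PA format; an HMAC with
# a shared demo key stands in for real public-key signatures and certificates.

import hashlib
import hmac
import json

SIGNING_KEY = b"demo-key-held-by-the-publisher"   # placeholder secret
ALLOWED_EDITS = {"crop", "resize", "recompress"}  # "acceptable changes"


def _sign(payload: dict) -> str:
    """Sign a canonical JSON encoding of the manifest payload."""
    data = json.dumps(payload, sort_keys=True).encode()
    return hmac.new(SIGNING_KEY, data, hashlib.sha256).hexdigest()


def create_manifest(media: bytes, source: str) -> dict:
    """Bind the captured pixels to a source identity at the first 'glass'."""
    payload = {
        "source": source,
        "content_hash": hashlib.sha256(media).hexdigest(),
        "edits": [],  # allowable post-production steps appended later
    }
    return {"payload": payload, "signature": _sign(payload)}


def record_edit(manifest: dict, edited_media: bytes, operation: str) -> dict:
    """Append an allowable edit and re-sign, preserving the content's history."""
    payload = dict(manifest["payload"])
    payload["edits"] = payload["edits"] + [operation]
    payload["content_hash"] = hashlib.sha256(edited_media).hexdigest()
    return {"payload": payload, "signature": _sign(payload)}


def verify(media: bytes, manifest: dict) -> bool:
    """Return pass/fail at the second 'glass' (the consumer's display)."""
    payload = manifest["payload"]
    untampered = hmac.compare_digest(_sign(payload), manifest["signature"])
    hash_matches = payload["content_hash"] == hashlib.sha256(media).hexdigest()
    edits_ok = all(op in ALLOWED_EDITS for op in payload["edits"])
    return untampered and hash_matches and edits_ok


if __name__ == "__main__":
    captured = b"raw frames straight from the sensor"
    manifest = create_manifest(captured, source="newsroom camera 42")

    edited = captured + b" (recompressed)"
    manifest = record_edit(manifest, edited, "recompress")
    print(verify(edited, manifest))       # True  -> "pass"

    manipulated = edited.replace(b"frames", b"fakes")
    print(verify(manipulated, manifest))  # False -> "fail"
```

A production pipeline would differ in the ways the post describes: publisher identities would be established with certificate-backed public-key signatures rather than a shared key, manifests would be bound to the media itself (for example, via watermarking) so they survive transmission, and the record of allowable changes would live in a tamper-evident ledger such as CCF rather than with any single signer.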
