How blockchains can and can’t be used in authenticating video and countering deepfakes

Written by ShamirAllibhai | Published 2018/11/10


The merits of blockchain to Amber Video’s mission

A few years ago it became clear that the Photoshopping of video, in a way that looks indistinguishable from authentic footage, was fast approaching. When it arrives, it will have a serious impact on visual evidence, and specifically on our ability to rely on it for facts and judgements. A world where video is duplicitous will delegitimize even genuine visual evidence, and society will veer toward the negative aspects of tribalism to decide its version of “truth”.

Blockchains*, the cryptography-based technology that records transactions in a verifiable and permanent way, hold a lot of promise and will positively impact the Internet in years to come. But they are not the solution to this problem: writing manipulated video data to a blockchain does not make it legitimate. (In fact, doing so may appear to validate the fake video, because the nascent perception of the technology is that a blockchain is a “source of truth” and, by extension, that the data in it is “truth”. No: the only valid inference is that the data has not been altered since it was written to the blockchain.)
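
To make that distinction concrete, here is a minimal Python sketch, not any product’s actual implementation: a plain dictionary stands in for the blockchain, and the video ID, byte strings, and function names are invented for illustration. The ledger happily records the fingerprint of a fake, and verification only ever proves the bytes are unchanged since they were written.

```python
import hashlib

ledger = {}  # stand-in for an append-only blockchain: video_id -> hash

def record(video_id: str, video_bytes: bytes) -> None:
    """Write a fingerprint to the ledger. A manipulated video hashes just
    as cleanly as an authentic one; the ledger cannot tell the difference."""
    ledger[video_id] = hashlib.sha256(video_bytes).hexdigest()

def unaltered_since_written(video_id: str, video_bytes: bytes) -> bool:
    """True means 'unchanged since it was written', not 'authentic'."""
    return hashlib.sha256(video_bytes).hexdigest() == ledger[video_id]

record("clip-001", b"deepfaked frames")  # a fake is recorded without complaint
print(unaltered_since_written("clip-001", b"deepfaked frames"))  # prints True
```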

But blockchains can be part of the solution.

There are multi-stakeholder environments where aural and visual evidence are the critical record of what occurred. They can absolve and implicate, and they help neutral parties earnestly unpack what happened by weighing all the evidence, of which video is one important part.

Manipulating video with ease.

Take, for example, police departments (where officers wear body cams) and the numerous stakeholders that exist when a shooting occurs. These could include the officer involved, investigators, the person who was shot and his/her legal team, prosecutors, judge/jury, watchdog groups, the media, and the general public. Many of these groups serve as an important part of our checks-and-balances system: how can each of these stakeholders have confidence in the veracity of the video?

More specifically, how can each of these stakeholders have confidence in the veracity of the video in a world where exploited video exists?

Artificial intelligence technology known as deepfakes allows bad actors to create increasingly believable, maliciously altered video with ever less technical knowledge required. Splicing politicians’ faces into pornography videos is a common stunt these days. When manipulating video to make it realistically appear that someone said or did something he or she had not becomes as simple as applying an Instagram filter, we should all be highly skeptical of the non-fiction media we consume. We should not automatically trust that a video is authentic and has not been altered unless there is a proven technological solution.

A solution that incorporates an immutable chain of custody.


One solution to the above law enforcement scenario is to fingerprint videos at the source recorder: the police body camera. These fingerprints, or hashes, are written to an immutable ledger, or blockchain, from the recorder itself, as most modern police body cameras have a wireless connection. As the video is downloaded from the device, uploaded to the cloud, clipped, and shared, each event is written to a smart contract (a transparent agreement that is part of many blockchains). This creates an audit trail for the file, with rehashing throughout the process to confirm that the file’s integrity is preserved at every step.
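
To illustrate the fingerprinting and audit-trail steps, here is a hedged Python sketch. The file name, the chunked SHA-256 hashing, and the in-memory audit_trail list (standing in for events written to a smart contract) are assumptions made for the example, not a real body-camera integration.

```python
import hashlib
import json
import time

def fingerprint(path: str, chunk_size: int = 1 << 20) -> str:
    """SHA-256 the file in 1 MB chunks so large videos don't exhaust memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical sample file so the sketch runs end to end.
with open("bodycam_0042.mp4", "wb") as f:
    f.write(b"\x00" * 1024)

audit_trail = []  # stand-in for events written to a smart contract

def log_event(path: str, event: str) -> None:
    """Rehash at each custody step (record, download, upload, clip, share)
    so any silent alteration shows up as a changed hash."""
    audit_trail.append({"event": event,
                        "sha256": fingerprint(path),
                        "timestamp": time.time()})

log_event("bodycam_0042.mp4", "recorded")
log_event("bodycam_0042.mp4", "uploaded_to_cloud")
print(json.dumps(audit_trail, indent=2))
```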

When the file is shared and played by a stakeholder, the fingerprinting process is rerun and the fresh hash is compared to the fingerprint in the smart contract. Either they match or they do not: it is binary. The video is authentic, or the video has been altered.
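
Continuing the same sketch, verification at playback is just a rehash compared against the fingerprint recorded at the source; flipping a single byte flips the result.

```python
def verify(path: str) -> bool:
    """Rehash at playback and compare with the fingerprint recorded at
    the source; the outcome is binary, exactly as described above."""
    return fingerprint(path) == audit_trail[0]["sha256"]

print(verify("bodycam_0042.mp4"))   # True: byte-for-byte identical
with open("bodycam_0042.mp4", "ab") as f:
    f.write(b"\x01")                # simulate tampering with a single byte
print(verify("bodycam_0042.mp4"))   # False: the video has been altered
```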

Which would you trust more:

a) a file sitting on the cloud server of an organization with a vested interest in the outcome of a case, and within a system where numerous people have access privileges and whose security practices are opaque; or

b) a file stored in a decentralized system, the permissions of which are transparent, and whose fingerprints — which you can independently confirm yourself — and audit history are readily accessible?

As stakeholders should have more confidence in the immutability of a blockchain than in the security policies and practices of a biased party, doubt about the legitimacy of the video will recede. A key premise of blockchain design is trust minimization: it seeks to create a secure environment between people and things (like Internet-enabled devices) and allows parties who do not fully trust each other to transact confidently.

Even the organizations and departments recording the evidence benefit: they don’t want to be tarnished by false accusations and lies. They don’t want to be on the receiving end of a faked video purporting to show their officers committing a violation that never actually occurred and that is antithetical to their mission. And arbiters, as well as society as a whole, will need to be skeptical of body-cam and bystander videos shared without a valid chain of custody.

A frequent defense at future trials, if we haven’t implemented a solution by then, will be that ‘manipulating video is commonplace, and the court can’t trust the police or the prosecution, as they are biased parties who recorded and held the evidence on their own insecure systems; thus, the evidence should be thrown out.’ Defense lawyers may not even need to prove who supposedly altered a video: the mere fact that fake videos exist will delegitimize authentic ones. That argument, coupled with genuine questions about data integrity in centralized systems, could be enough to get video evidence excluded. And without that video evidence, the case may get dismissed.

There will be bad actors on both sides of a controversial issue, each intent on skewing public sentiment in their favor. Bad actors will distribute faked videos via social media’s echo chambers, hoping they will spread, create controversy, generate outrage, and catalyze upheaval.

We must not allow this to occur. We need the right technologies, system designs, and incentives to prevent this. Evidence-based conclusions have been critical to the advancement of societies. Regressing to negative aspects of tribalism to make judgements will chip away at democratic institutions.

We can’t stop rapidly advancing technology, or the people intent on wielding it for malice, from creating fakes, but we can blunt their impact.

By securely fingerprinting video at the time of recording and tracking those hashes through to distribution, we remove a major source of doubt among stakeholders in a world where fake video is prevalent. When bad actors manipulate videos and distribute them online to sow chaos, reasonable people, guided by the pursuit of facts, will be able to confirm the legitimacy of a video and then focus on what actually transpired in the recorded scene.

And blockchain is one important technology in this process, helping to create trust minimization (vs. trusting vulnerable systems), combat video manipulation, and preserve due process.

To sign up to try Amber’s video authentication iOS app: https://itunes.apple.com/us/app/amber-video/id1358190681

To find out more about Amber Video: https://ambervideo.co

*Note: there are many companies creating blockchain technologies, each with their own specifications and features. Not all blockchains are secure. Assume in this piece that the blockchain used is secure and immutable, as is the vision for the Bitcoin blockchain, the original blockchain that many others are based on and inspired by.

