How Machine Learning Developed the Face of MCU’s Thanos

Written by jptuttleb9_8483 | Published 2018/09/02
Tech Story Tags: movies | machine-learning | marvel | mcu | thanos-face


The events of Avengers: Infinity War, especially the final moments of the film, have had a long time to sink in with fans. That said, the climax is not as devastating a blow as it could be: it will almost certainly be reversed in the sequel.

The character behind all the death and destruction is the power-hungry Titan and tyrant Thanos, who craves the Infinity Stones for the power they bestow. The extreme forces he wields are literally at his fingertips, but those same stones could prove to be his undoing.

Thanos had made a few brief appearances in the MCU prior to Infinity War. Believe it or not, it was not Josh Brolin who portrayed the Titan in the end-credits scene of The Avengers all those years ago; that part went to actor Damion Poitier.

It was two years later, in Guardians of the Galaxy, that Brolin stepped up to the plate and played Thanos through motion capture. If you follow the movie industry, you know that the technologies behind filmmaking are always changing.

Thus, with ever-adapting CGI tech, the look of Thanos has changed slightly from movie to movie. Digital Domain, one of several visual effects companies working on Infinity War (and one that also worked on the live-action Beauty and the Beast), wanted to bring more of Josh Brolin’s facial features into Thanos’ character. Marvel Studios began working with the company several months before principal photography to run effects tests. That’s where AI stepped in to help with the “movie magic.”

Masquerade, a new piece of machine-learning software, was used to obtain the desired effect. Between 100 and 150 tracking dots were attached to the actor’s face and recorded by a pair of HD cameras. The recordings themselves are fairly low-quality, but they serve their purpose.
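
Digital Domain has not published Masquerade’s internals, but the first step in any setup like this, recovering 3D marker positions from two camera views, is standard stereo triangulation. Here is a minimal sketch in Python; the calibrated projection matrices P1 and P2 and the pre-matched dot lists are assumptions for illustration, not details from the actual pipeline.

```python
import numpy as np

def triangulate_marker(P1, P2, x1, x2):
    """Recover one 3D marker position from its pixel coordinates in two
    calibrated cameras, using linear (DLT) triangulation.
    P1, P2: 3x4 camera projection matrices; x1, x2: (u, v) pixel coords."""
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The homogeneous 3D point is the null vector of A (last row of V^T).
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

def markers_to_mesh(P1, P2, dots_cam1, dots_cam2):
    """Triangulate all ~100-150 tracked dots for one frame into a sparse
    'low-resolution mesh' of 3D points."""
    return np.array([triangulate_marker(P1, P2, a, b)
                     for a, b in zip(dots_cam1, dots_cam2)])
```

Running markers_to_mesh on every frame would yield the sparse, low-resolution face mesh described next.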

The recordings are fed through a machine-learning algorithm trained on a vast collection of high-resolution facial scans covering a broad range of emotions. Kelly Port, the VFX supervisor who headed the Digital Domain team, said that Masquerade “takes that low resolution mesh and it figures out what high resolution shape face would be the best solution for that…Then it gives you a solution, and then we would look at that result. If it didn’t feel quite right, we would make a little tweak in modeling to adjust.”
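
Port’s description (a low-resolution mesh in, the best-fitting high-resolution face shape out) maps naturally onto a regression from sparse marker positions to dense scan vertices. The real Masquerade model is proprietary, so the sketch below stands in for it with a simple closed-form ridge regression; the paired training arrays marker_frames and scan_frames are hypothetical.

```python
import numpy as np

def fit_upres_model(marker_frames, scan_frames, lam=1e-3):
    """Learn a linear map from flattened low-res marker positions
    (n_frames, n_markers*3) to flattened high-res scan vertices
    (n_frames, n_vertices*3) via closed-form ridge regression."""
    X, Y = marker_frames, scan_frames
    W = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ Y)
    return W

def upres_face(W, low_res_mesh):
    """Predict a dense facial shape for one captured frame."""
    return low_res_mesh.ravel() @ W   # -> (n_vertices*3,)
```

The artist review step Port mentions, making “a little tweak in modeling,” would correspond to hand-correcting the predicted vertices whenever a solve doesn’t look right.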

The high-resolution facial data produced by Masquerade is then transferred onto Thanos’ computer model. Without Masquerade or another machine-learning program, the task of transferring and adjusting facial performances would have to be done manually, and that’s a tedious procedure.
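
A common way to move an actor’s facial shape onto a different character is blendshape retargeting: solve for the expression weights that best explain the actor’s captured shape, then drive the creature’s matching blendshapes with those same weights. Whether Digital Domain used exactly this scheme isn’t public; the sketch below, with hypothetical rig arrays, just illustrates the idea.

```python
import numpy as np

def retarget(actor_neutral, actor_deltas,
             thanos_neutral, thanos_deltas, captured_shape):
    """Transfer a captured actor expression to a creature rig.
    actor_deltas / thanos_deltas: (3*n_verts, n_blendshapes) arrays whose
    columns are per-blendshape vertex offsets from the neutral pose."""
    # Least-squares solve: which mix of the actor's blendshapes best
    # reproduces the captured shape?
    weights, *_ = np.linalg.lstsq(
        actor_deltas, captured_shape - actor_neutral, rcond=None)
    # Drive the creature's semantically matching blendshapes with the
    # same weights.
    return thanos_neutral + thanos_deltas @ weights
```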

In the end, Digital Domain had a hand in creating more than 400 shots in Infinity War. Dan DeLeeuw, the VFX supervisor at Marvel Studios, said, “Doug Roble, the guy that’s working on that [Digital Domain] software, said something along the lines of, ‘If you’re not using machine learning in your software, you’re doing it wrong.’”

Machine-learning programs are likely to become the standard way of designing computer-generated movie characters. Doubtless, Masquerade or a similar program will be used for the fourth installment of the Avengers saga.

Perhaps in a few years, machine learning will be so advanced that moviegoers won’t be able to tell the real from the unreal. Many modern films already come close.

