How ILM Achieved One Of The Biggest (Literally) Surprises In Avengers: Endgame

We’re talking, of course, about the Hulk.

Avengers: Endgame is not just the end of a 22-film saga in the Marvel Cinematic Universe, it’s the end of a pseudo-trilogy for Bruce Banner and the Hulk. Starting with Thor: Ragnarok, moving to Avengers: Infinity War, and ending in Endgame, fans have seen Banner go from only Hulk, to only Banner, and finally the perfect blend of the two as Smart Hulk.

The task of blending Bruce Banner and Hulk fell to Industrial Light & Magic, one of several effects companies to work on what’s already one of the highest-grossing films of all time. And while the company also worked on aspects of the final battle, New Asgard, the Quantum Realm, and more, Smart Hulk ended up being the biggest challenge for the team.

“It’s definitely not just a one-to-one where Mark [Ruffalo] does the performance and it all gets remapped onto Hulk,” Russell Earl, the film’s VFX supervisor, told io9 on the phone recently. “There are definitely tools and steps along the way that you have to develop to get there…Ultimately our goal was to give the animation team ultimate control of the [character] starting with the performance that’s delivered by Mark.”

Before any of that could happen though, Earl and his team had to prove to Marvel and the Russo Brothers the balance of actor and superhero was even possible. So they did what any person would do: turned to the internet.

“We downloaded some footage of a Spotlight interview,” Earl said. “It was just Mark Ruffalo sitting in a conference room, sort of a behind the scenes kind of interview talking about the importance of the film. Obviously, that film is more dramatic [so] we took that and then we keyed that performance onto Smart Hulk.”

In previous films, Ruffalo’s performance as Hulk was bigger and “a little more grunty,” according to Earl. But once everyone saw the Spotlight interview roughly placed on Hulk, they saw the ultimate potential of the character.

“Once we did the test we sent it down and [the filmmakers] were like ‘This is great,’” Earl said. “‘[Ruffalo] doesn’t have to overact or make the performance too big…We can get that very subtle performance and strive to be that mix of Mark and Hulk.’”

“That was sort of the awe moment for [them] in terms of ‘Well, now we know where we can go with the performance,’” he continued. “And we also gave them the confidence to actually expand out some of the scenes that he was in and add a couple scenes just knowing that we would be able to get a believable character up on screen.”

However, when the tests were done and real work began, Earl and his team realised the end result just wasn’t good enough. On set, Ruffalo was using mostly traditional performance capture methods, and ILM was using its go-to tools — but, as tends to be the case, things needed to be improved to achieve the best possible result.

“We started with, what at the time was, our kind of latest and greatest facial capture system and then we completely scrapped it and rebuilt it,” Earl said. “Every [movie] you’re getting better and better tools and by the end, you’re like ‘Oh if we only had this, if we only had that [at the beginning].’ So you’re constantly trying to improve upon it. That’s just sort of the nature of a lot of the folks that work here.”

Here’s where things get a little complicated.


ILM realised the problem with Smart Hulk had to do with its “snapsolvers.” That’s ILM’s name for the software tool that interprets the raw facial data captured by the cameras and applies it to the digital model of the actor. “We were taking the solve from the dots that we got from the head-mounted cameras and then going back and comparing that to the original shapes to just make sure everything was in alignment and things weren’t off model,” Earl said. “Our traditional snapsolve that we had starting on the project didn’t have the ability to do that.”
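To make that idea a little more concrete, here’s a heavily simplified sketch of what an “on model” check can look like in principle. This is not ILM’s snapsolver; the function names, array shapes, and maths here are assumptions for illustration only. It just shows the basic idea of comparing a solved face against a library of originally captured shapes and measuring how far it has drifted:

```python
# Toy sketch (not ILM's snapsolver; names and shapes are assumptions):
# compare a frame's solved face against the library of originally captured
# shapes and report how far it has drifted "off model".
import numpy as np

def off_model_amount(solved_face, neutral, shape_library):
    """solved_face   : (n_points, 3) face produced by the dot solve
    neutral          : (n_points, 3) resting face
    shape_library    : (n_shapes, n_points, 3) captured shapes minus neutral
    Returns the largest distance between the solved face and its best
    reconstruction from the original shape library."""
    # Stack each captured shape as a column and fit the solve against them.
    A = shape_library.reshape(shape_library.shape[0], -1).T
    b = (solved_face - neutral).reshape(-1)
    weights, *_ = np.linalg.lstsq(A, b, rcond=None)
    # Rebuild the closest "on model" face and measure the worst-case gap.
    recon = neutral + np.tensordot(weights, shape_library, axes=1)
    return float(np.linalg.norm(recon - solved_face, axis=1).max())
```

A frame whose error came back above some tolerance would, in a setup like this, get flagged and sent back for cleanup rather than passed on to animation.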

The process begins with a “Medusa capture.” That’s where an actor sits down with a bunch of cameras pointed at them and makes all kinds of faces. Those faces then go into the computer and, using a snapsolve, the computer should be able to make any other face based on those examples. But, as Earl said, their snapsolve wasn’t working well enough in this case. So they fixed it by using a then-unrelated solver called “Anyma,” which Disney Research created for capturing body and facial motion during voice recording sessions on animated films.
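For a rough sense of how a library of example faces turns into “any other face,” here’s a toy version of that pipeline. Again, these names and the drastically simplified solve are assumptions, not Disney’s or ILM’s actual tools; it only illustrates building a shape library from captured expressions and then fitting a frame of tracked dots against it:

```python
# Toy sketch (hypothetical, heavily simplified): build a library of example
# expressions from a seated multi-camera capture, then solve one frame of
# tracked facial dots as a blend of those examples.
import numpy as np

def build_library(scans):
    """scans: (n_expressions, n_points, 3) captured example faces."""
    neutral = scans[0]                      # treat the first scan as the rest pose
    return neutral, scans[1:] - neutral     # remaining scans become offsets

def solve_frame(dots, dot_ids, neutral, deltas):
    """dots    : (n_dots, 3) tracked marker positions for one frame
    dot_ids    : indices of the mesh points the markers sit on
    Returns blend weights and the reconstructed full face for that frame."""
    A = deltas[:, dot_ids, :].reshape(deltas.shape[0], -1).T
    b = (dots - neutral[dot_ids]).reshape(-1)
    weights, *_ = np.linalg.lstsq(A, b, rcond=None)
    face = neutral + np.tensordot(weights, deltas, axes=1)
    return weights, face

# Example with random stand-in data: 30 captured expressions,
# a 4,000-point face mesh, 80 tracked dots.
scans = np.random.rand(30, 4000, 3)
neutral, deltas = build_library(scans)
dot_ids = np.arange(0, 4000, 50)
dots = neutral[dot_ids] + 0.01 * np.random.randn(80, 3)
weights, face = solve_frame(dots, dot_ids, neutral, deltas)
```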

ILM found that by adapting that solver to its Medusa captures, it could produce the kind of high-fidelity results needed to hit the perfect balance of face and body mannerisms that truly sold Smart Hulk.

“The character sort of developed as we were working on the film and they were adding scenes,” Earl said. “There was some stuff that we had that just felt a little angry…and we could go in and sort of dial the angry down a little bit or just work on making a performance a little bit more likable. And so there were those things that we could [now] do in animation. Banner could just be Banner and Mark didn’t have to go too crazy with performance.”

But the obstacles kept coming. Now that the body and face were working right, the Endgame team decided to keep adding outfits to the character. Earl explained that, at the start, he thought there would be about three outfits. That number eventually ballooned to 10.

“In fact the first outfit we built I don’t think we ended up even using,” he said. “Sort of a hoodie and sweats was his first thing…every couple weeks we’d get a different outfit. He’s probably suddenly got a bigger wardrobe than I do.”

Whenever a new outfit was placed on the character, adjustments had to be made to all the previous work in a scene.

“It’s funny because in different costumes we were going in and changing his proportions a little bit,” Earl continued. “How big his chest was, how big his calves or feet were, how big his hands were, just sort of relative stuff that you don’t notice in the shots, per se, but when you first do you notice ‘Wow the chest looks really big’ or ‘His hands look huge.’ And we’re sort of rebalancing that kind of stuff on a per shot basis so that it still has the same look and feel despite minor changes in different outfits.”

Then there were the glasses.

“His glasses were a little bit of a later addition as well,” Earl said. “He had them and then he didn’t have them and then it came back to having glasses. Obviously, everything is challenging but I think it had the largest impact in the lighting and eyelines…Sometimes we just had to go back and do a little bit of relighting or add a little extra balance or knock down some of the shadowing so that the animation still had the same read.”

Watching a movie like Endgame, we see a character like Smart Hulk, and we laugh and enjoy him. Rarely do we think about the amount of time and effort it took to bring that incredible visual effect to the big screen. And that’s exactly what people like Russell Earl want to hear: that the work was so seamless, you didn’t even notice it was there.

Avengers: Endgame is currently in theatres.

