We’ve covered how game engines are helping filmmakers and how the next Spielberg could be making films in Unreal Engine, and we’ve even stood on an LED Volume stage for ourselves. But the use of this technology is growing so fast that it’s worth pausing to look at how real-time 3D and traditional filmmaking techniques are merging.
One of the studios leading the way on using real-time 3D, and Unreal Engine in particular, in film production is Magnopus, a creative studio that has been pioneering the use of new technology for over a decade. The team’s most recent work is on Amazon’s Fallout TV series, where Magnopus artists combined in-camera VFX with the real-time flexibility of Unreal Engine to create a virtual production workflow that could sit alongside 35mm film.
We’ve already covered the surprising design influences found in Amazon’s Fallout show, but AJ Sciutto, director of virtual production at Magnopus, met with me to discuss how Unreal Engine 5 and new filmmaking techniques were used to bring Bethesda’s iconic video game world to life with a blend of digital and practical creativity.
UE5 is blending filmmaking techniques
Magnopus’ AJ Sciutto begins by telling me the workflow his team uses is “entirely 3D” and that the studio is a “real time technology company”, so from the outset the aim is to find new ways of making films. “For us, the ability to work in 3D, in real time in Unreal Engine, makes the ability to respond to notes much quicker, more iterative and dynamic,” he says.
Speed is something a real time workflow adds to production, and AJ explains how using Unreal Engine rather than a pre-rendered VFX pipeline enabled the team to “push harder” on aspects of production. While traditional VFX teams may use Unreal Engine to block-in shots and pre-vis, using a completely real time workflow is more adaptable.
AJ says: “They’re [using a] post-rendering pipeline, and that’s why it takes sometimes four or five weeks to turn around a change or change requests. But by working in Unreal, you get a bunch of benefits; obviously there are things like being able to change lighting and set design on the fly, you can’t get that any other way”.
For Fallout the Magnopus team provided creative input across the whole production, working with director Jonathan Nolan and the filmmakers to break down scripts into scenes and devise environments that would be best served by in-camera VFX. The team eventually worked on four sets – the farm and vault door scenes in Vault 33, the cafeteria setting in Vault 4, and the New California Republic’s base inside the Griffith Observatory. The shots inside the Vertibird also made use of the LED stage process – AJ reveals they kept the LED background detail a secret from the actors, so their reactions to Fallout’s world were ones of genuine surprise.
Magnopus has been working with virtual sets for over a decade, and AJ points out that the speed and adaptability of LED Volume stages have increased since The Mandalorian. That proved very useful on the set of Fallout, as Jonathan Nolan “likes to be dynamic”, and because everything in Unreal Engine is 3D and real time, “we can make those changes live on set if we need,” says AJ.
But it’s not as simple as you might think; that dynamic inspiration needs to be planned for, reveals AJ. He says: “Let’s plan for it. Let’s talk about it. Let’s design it. Let’s think about the variables and let’s determine what things are going to be ultra dynamic, where I can put the big shot, what things are going to take maybe five minutes to change”.
He adds: “The shoe leather shots of walk and talks, that’s a no brainer, we can do that all day, but if you want to start putting a crane in the shot or if you want to start doing dynamic lighting changes in mid-shot while the camera’s rolling or if you want to move set pieces, virtual set pieces in shot, yeah, those have to be worked out together beforehand so that you are not scrambling on the day, because those sets are not meant to be that dynamic.”
Merging physical and virtual sets
Virtual sets have come on leaps and bounds since The Lion King and now blend physical, traditional set design with LED volume walls. Working with production designer Howard Cummings and art director Laura Ballinger, the Magnopus team had to decide which pieces of the set should be built physically and what would be enhanced by 3D scenery, set extensions and Lumen-powered LED lighting. The team used traditional concept art and storyboards to coordinate the merging of the two worlds and to work out where the physical and virtual transitions would be.
“We found there is a clear way to make the content look more believable,” says AJ, who explains: “You have foreground actors, you have the mid-ground practical set design and you have this farther mid-ground virtual set, and then you have this far away background, virtual environment.”
Importantly, says AJ, “that blend between the mid-ground layers is what’s most important to allow for the kind of realism of the full thing, the full environment to shine through the lens”.
AJ tells me the team experimented with the play of light, colour and texture and the mix of materials in the physical sets, and over the years the team has discovered reflective surfaces work best with an LED Volume. “You get more production value if you can reflect the background on to your physical set,” reveals AJ.
He adds: “So on Westworld every bench was shiny and reflective and on Fallout a lot of the materials were made to look a little bit rough; we add a little bit of sheen to it to get some colour and reflection from the LED wall [which] reflects back into it and makes it feel more real.”
The sets on a modern LED Volume stage are designed with layers of physical set and virtual environment, which helps blend the set design directly in the lens, creating in-camera VFX. It also fosters a new sense of cooperation between new and old filmmakers. “So there’s a language and a comfortability with the filmmaker and technology grips to get to the point where they can actually feel like they’re shooting on a practical set,” shares AJ.
The blend of real and 1:1 scale virtual sets on Fallout is seen at its best, says AJ, in small details many could overlook but which subtly add to the cohesion the Magnopus team sought.
There’s a scene in the farm setting of Vault 33 where a battle breaks out and gunfire hits the virtual projectors screening farmland scenes onto the Vault’s walls – it’s a kind of LED Volume within an LED Volume – causing the film stock to burn. It starts in the middle of the wall and then rises and extends as an undulating flame that spreads across the wall.
“So we did all these studies to determine how film burns, how cellulose burns. What’s the colour reproduction? We shot celluloid film burning inside of a trash can, outside here in downtown LA. We shot references, we found video references online,” says AJ, laughing. “So that was not the most typical workflow for real time departments to create video inside of a real time game engine of cellulose burning.”
“The more challenging part was how to distribute across this giant LED wall and 12 render nodes all running in real time and actually make it feel really good, and have that light from the burning orange cellulose impact the footage being captured inside the Volume,” says AJ. “We had two dedicated visual effects artists working on that for probably two months to make that feel right”.
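To give a flavour of what that distribution involves, here is a minimal Python sketch of how a single wall-sized canvas could be carved into per-node viewports. The wall resolution, seam overlap and node names are assumptions for illustration only, not the configuration of the actual Fallout stage, and the engine’s own multi-node tooling handles this in far more detail.

```python
# Assumed wall canvas: 12 render nodes driving a hypothetical 21,600 x 2,160 pixel LED wall.
WALL_WIDTH, WALL_HEIGHT = 21_600, 2_160
NODES = 12
OVERLAP = 64  # shared pixels at each seam so an effect like the film burn blends cleanly

slice_width = WALL_WIDTH // NODES

viewports = []
for node in range(NODES):
    left = max(node * slice_width - OVERLAP, 0)
    right = min((node + 1) * slice_width + OVERLAP, WALL_WIDTH)
    viewports.append({
        "node": f"render_node_{node:02d}",       # hypothetical node name
        "region": (left, 0, right, WALL_HEIGHT),  # x0, y0, x1, y1 on the wall canvas
    })

# Every node renders its own region of the same synchronised scene each frame,
# so an effect sweeping across the wall has to stay continuous across the seams.
for vp in viewports:
    print(vp["node"], vp["region"])
```

The point of the sketch is simply that a single animated effect, like the burning celluloid, is never rendered by one machine; it has to look seamless across a dozen synchronised slices of the same wall.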
He adds: “I think the way that was represented in the final product looks fantastic, and so we’re very happy with that. It was definitely a new challenge for us, but I think it came out really great.”
Retaining the dynamic nature of film
Returning to the idea of planning shots, AJ tells me how the virtual sets of Fallout were built entirely in Unreal Engine 5, allowing the filmmakers to take advantage of ‘virtual scouting’ in VR to plan shots and block out scenes. The technology enabled the filmmakers to place cameras and characters in a shot, set the lenses and save the data to create a heat map of the environment, which let the set builders decide which areas needed the most attention.
This level of detail and set-building in 3D first meant the “happy accidents of award-winning shots”, as AJ calls them, can still exist when working digitally. Unlike a pre-rendered pipeline there’s a “dynamic nature” to 3D environments that enables directors to explore a set. “You can move the camera around to any perspective and shoot with the parallax of having a tracking camera through 3D space,” says AJ, who explains the “organic nature of filmmaking” can still exist.
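The idea of a scouting heat map is simple enough to sketch. The short Python example below assumes hypothetical saved camera positions, yaws and lens fields of view, and counts how many saved shots see each cell of the stage floor; it is an illustration of the concept, not Magnopus’ actual tooling.

```python
import numpy as np

# Hypothetical saved scouting data: camera position (x, y) in metres,
# yaw in degrees, and horizontal field of view for the chosen lens.
saved_cameras = [
    {"pos": (2.0, 3.0), "yaw": 45.0, "fov": 40.0},
    {"pos": (8.0, 1.5), "yaw": 160.0, "fov": 24.0},
]

GRID = 50          # 50 x 50 cells covering the stage floor
STAGE_SIZE = 10.0  # assume a 10 x 10 metre stage

heat = np.zeros((GRID, GRID))

for cam in saved_cameras:
    cx, cy = cam["pos"]
    for ix in range(GRID):
        for iy in range(GRID):
            # Centre of this floor cell in stage coordinates
            px = (ix + 0.5) * STAGE_SIZE / GRID
            py = (iy + 0.5) * STAGE_SIZE / GRID
            # Angle from the camera to the cell, relative to where the camera points
            angle = np.degrees(np.arctan2(py - cy, px - cx)) - cam["yaw"]
            angle = (angle + 180) % 360 - 180  # wrap to [-180, 180]
            # Count the cell if it falls inside the lens's horizontal field of view
            if abs(angle) <= cam["fov"] / 2:
                heat[iy, ix] += 1

# The hottest cells are seen by the most saved shots, so they are the areas
# the set builders would prioritise for detail.
print(np.unravel_index(heat.argmax(), heat.shape))
```

However the data is gathered in practice, the principle is the same: the more saved shots that see a given area of the set, the more attention it deserves from the builders.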
“That’s another one of the benefits we found in building our environments as 3D, is that Jonah likes to run and gun. He’s not somebody who likes to pre-vis the entire show before he walks on set. He likes to know what set pieces he has available and what environment variables he’s got, where lighting is coming from,” explains AJ.
Jonathan Nolan is not a director who likes to stick rigidly to storyboards, “he likes to be more dynamic and organic than that,” shares AJ, adding “so our set design had to accommodate for that because that’s where some of the most beautiful language from the filmmaker comes from”.
The kind of flexibility LED Volume offered the creative team on Fallout suggests a rebalancing of digital and physical filmmaking, where 3D virtual sets, real time lighting and a mix of physical and virtual spaces can enhance a shoot.
This isn’t to say AJ would recommend an LED Volume to every filmmaker, saying: “I would never suggest to a filmmaking team or to a filmmaker themselves, that LED walls need to be standard in their filmmaking language. LED walls serve a purpose or a very kind of distinct and bespoke set of limitations of set design that can help a filmmaker not have to shoot green screen or visual effects. There’s a whole host of things that I could never suggest to a filmmaker to use a volume for.”
There’s a grounded-in-truth nature to filmmaking that can often get lost when CG and green screen are used, and this has fuelled the ‘anti-CG’ debate (as Hugo Guerra explored in his article ‘CGI in movies, what’s not to like?‘), which sees a separation between the Marvel-style moviemaking process and some of the more grounded, gritty and character-based filmmaking.
An LED Volume approach that combines virtual and physical sets, makes use of the speed of in-camera VFX and leaves scope for those organic ‘happy accidents’ filmmaking thrives on feels like a third way, marrying the traditional filmmaking process with state-of-the-art technology.
AJ adds: “When you find these set-pieces and you design something that does work for [LED] Volume, I think the output can be much, much more impactful than doing a green screen through a traditional VFX effects pipeline.”
The future is just going to get even more interesting as real and virtual worlds are merged, reveals AJ, who hints at off-screen experiences for viewers. But for filmmaking, he smiles and becomes animated at the idea of using neural radiance fields (NeRFs) and Gaussian splatting, techniques related to photogrammetry that use AI to convert captured light data into volumetric and point cloud representations of a scene.
He explains: “You go to an exotic location, or somewhere that you can’t really bring an entire crew into, where you need to shoot for three weeks, it’s too cost prohibitive to be able to bring a whole crew there for that long, so you can capture those environments and represent them back in an LED Volume in pretty incredible detail and still get all the fidelity of being able to move the camera to where you need it without being stuck to traditional 2D background photography”.
This new method has the potential to add a whole new level of realism to virtual sets, with true representations of light and texture, “a single moment in time,” says AJ before adding, “it’s somewhat 4D”.
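For the curious, the core of a NeRF-style renderer boils down to a compositing step along each camera ray: density samples are turned into weights and blended into a single pixel colour. The Python sketch below is a textbook illustration of that volume rendering equation with made-up sample values; it is not specific to Magnopus’ pipeline or to any particular Gaussian splatting implementation.

```python
import numpy as np

def composite_ray(densities, colours, deltas):
    """NeRF-style volume rendering for one camera ray.

    densities: (N,) predicted density at each sample along the ray
    colours:   (N, 3) predicted RGB at each sample
    deltas:    (N,) distance between consecutive samples
    """
    # Opacity of each sample: alpha_i = 1 - exp(-sigma_i * delta_i)
    alphas = 1.0 - np.exp(-densities * deltas)
    # Transmittance: how much light survives to reach each sample
    transmittance = np.cumprod(np.concatenate([[1.0], 1.0 - alphas[:-1]]))
    weights = transmittance * alphas
    # Final pixel colour is the weighted sum of the sample colours
    return (weights[:, None] * colours).sum(axis=0)

# Toy example: three samples along one ray
densities = np.array([0.1, 2.0, 0.5])
colours = np.array([[0.9, 0.8, 0.7], [0.4, 0.3, 0.2], [0.1, 0.1, 0.1]])
deltas = np.array([0.5, 0.5, 0.5])
print(composite_ray(densities, colours, deltas))
```

Gaussian splatting swaps the sampled ray for a cloud of projected 3D Gaussians, but the idea AJ is excited about is the same: a captured location becomes a renderable 3D scene rather than a flat background plate.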
Set against this prospect, the future of filmmaking as old and new techniques merge is going to be interesting. If you want to find out more about Magnopus’ work on Fallout, read the blog post on the Magnopus website.