
Virtual production and the cutting edge


Header image: StageCraft on the set of The Mandalorian

Virtual production rocketed to prominence with the debut of the Disney+ series The Mandalorian. The team at Lucasfilm placed LED displays behind the actors instead of green or blue screens, combining video game technology with cutting-edge motion tracking. The result of this combination is known as the LED volume. Today, LED volumes are springing up in studios around the world. What does it take to incorporate this way of storytelling into your own films?

“Visual effects are becoming real-time.”

The big benefit of virtual production is the ability to see the environment on set instead of in your imagination. The actors can meld into their environment more easily, and the cameras can capture the “final pixels” in real time. Working this way requires more pre-production planning, which brings the pre-production, production, and post-production teams together in ways never seen before.

The real-time aspect here is key. 3D sets are built and then displayed on the LED volume, and the camera’s movements are tracked and synced with the display. The frustum, a buzzword you’ll hear constantly in virtual production, is the area of the screen that the camera sees. This region is rendered in real time from the camera’s perspective, producing the natural parallax that a still matte painting could never give you. The ability to recreate this effect is much of what makes virtual production so compelling.
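To make the frustum idea concrete, here is a minimal Python sketch, assuming a flat wall and a camera aimed straight at it (the function name and numbers are illustrative, not from any real LED-volume toolkit):

```python
import math

def inner_frustum_rect(cam_to_wall_m, h_fov_deg, aspect=16 / 9):
    """Width and height (in meters) of the wall patch the camera sees,
    i.e. the region that must be re-rendered from the camera's perspective."""
    width = 2 * cam_to_wall_m * math.tan(math.radians(h_fov_deg) / 2)
    return width, width / aspect

# A camera 4 m from the wall with a 60-degree horizontal field of view:
w, h = inner_frustum_rect(4.0, 60.0)
print(f"Inner frustum on the wall: {w:.2f} m x {h:.2f} m")  # ~4.62 m x 2.60 m
```

As the tracker reports a new camera position and lens angle each frame, this rectangle slides across the wall, and only that region needs the full perspective-correct render.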

LED volumes

The bank of high-resolution LED panels can be large or small. It can be a single flat panel or curve into a rounded volume with a ceiling and floor. The higher the resolution of the panels, the higher the fidelity of the image. In Everything Everywhere All at Once, LED panels were used to throw lighting effects on the actors. The filmmakers describe using a very low-budget version of the technology, but it served its purpose of adding interactive lighting to their talent. Even with the LED screens at an angle, you can see the pixels in the video and recognize that there is a big difference between what they were working with and what is on the stage at Lucasfilm.
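For a rough sense of the numbers, here is a back-of-the-envelope sketch relating pixel pitch (the spacing between LED pixels) to wall resolution. The “keep the camera beyond” line uses a common industry rule of thumb of roughly one meter of distance per millimeter of pitch, not a hard spec:

```python
def wall_resolution(width_m, height_m, pixel_pitch_mm):
    """Pixel dimensions of an LED wall of a given physical size and pitch."""
    return (int(width_m * 1000 / pixel_pitch_mm),
            int(height_m * 1000 / pixel_pitch_mm))

for pitch_mm in (1.5, 2.6, 3.9):  # finer pitch = higher fidelity (and cost)
    w, h = wall_resolution(20.0, 6.0, pitch_mm)
    print(f"{pitch_mm} mm pitch: {w} x {h} px; "
          f"keep the camera beyond ~{pitch_mm} m")
```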

Unreal Engine

How do you get the 3D background world onto the LED screen? The most common application for building these real-time worlds is Unreal Engine, which rose to prominence as a tool for building 3D video game environments. 3D artists can create background landscapes that appear photorealistic. Unreal can render live while the shoot is happening on set, or it can provide the “first draft” of the background, with artists then using tools like Houdini to create the final imagery displayed on the LED volume.

Camera choice

Test, test, test. That seems to be the mantra when choosing a camera for your LED volume shoot. The crew at REMEDY tested out a bunch of high-end cameras and concluded that the RED Komodo and Sony Venice were two of the best.

At the end of the video, the testers commented that they were surprised that the vaunted Alexa 35 ($93K) and RED V-Raptor XL ($40K) exhibited some artifacts, while the $6,000 RED Komodo avoided artifacts like “screen tearing,” also known as the “Venetian blinds effect.” The Komodo features a global shutter, and RED has published a short case study on using the Komodo for virtual production. The Sony Venice 2 lacks a global shutter but features an extremely fast readout speed of less than 4ms. (The Venice 1 was used in the test, and it evidently shares a similarly fast readout.) The RED V-Raptor appears to have been tested at the VistaVision sensor size in 8K, which has a 25% slower readout than the Super 35 6K mode.
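Why does readout speed matter so much? Here is a rough Python sketch of the rolling-shutter problem: if the sensor is still reading out rows when the wall updates to its next frame, the top and bottom of the captured image show two different wall frames. The model below is deliberately simplified (it assumes the wall updates at the camera frame rate and is not genlocked), and the readout figures echo the ones quoted above:

```python
def fraction_of_frames_torn(readout_ms, wall_update_hz=24, genlocked=False):
    """Chance that a wall update lands inside the sensor's readout window.
    Genlocking the wall to the camera pushes updates outside that window,
    and a global shutter shrinks the window to effectively zero."""
    if genlocked:
        return 0.0
    return min(1.0, (readout_ms / 1000) * wall_update_hz)

for name, readout_ms in [("global shutter, near-instant readout", 0.05),
                         ("Venice-class, <4 ms readout", 4.0),
                         ("25% slower large-format readout, ~5 ms", 5.0)]:
    print(f"{name}: ~{fraction_of_frames_torn(readout_ms):.1%} of frames at risk")
```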

Another major visual issue is moiré: rolling interference patterns that appear when fine repeating detail, such as the LED wall’s own pixel grid, clashes with the camera sensor’s pixel grid. Blackmagic Design recently released an updated version of its 12K camera with an optical low-pass filter designed to mitigate this problem.
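A simplified way to think about moiré: the wall’s pixel grid is imaged onto the sensor at a size set by focal length and distance, and trouble starts when that projected pitch lands near the sensor’s own pixel pitch. The thresholds below are illustrative, not a calibrated model, and in practice the strongest defense is simply keeping the wall out of focus:

```python
def projected_pitch_um(led_pitch_mm, focal_mm, distance_m):
    """Size of one LED pixel as imaged on the sensor, in microns (thin lens)."""
    magnification = focal_mm / (distance_m * 1000 - focal_mm)
    return led_pitch_mm * 1000 * magnification

def moire_risk(led_pitch_mm, focal_mm, distance_m, sensor_pitch_um=5.0):
    """Risky when the projected grid is near the sensor's sampling pitch."""
    p = projected_pitch_um(led_pitch_mm, focal_mm, distance_m)
    return 0.8 * sensor_pitch_um < p < 2.5 * sensor_pitch_um

# 2.6 mm wall pitch, 50 mm lens: up close the grid is plainly resolved
# (visible pixels, a different problem); the danger zone is mid-distance.
for d in (5, 15, 40):
    p = projected_pitch_um(2.6, 50, d)
    print(f"{d} m: projected pitch {p:.1f} um ->",
          "moire risk" if moire_risk(2.6, 50, d) else "likely OK")
```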

These kinds of details make virtual production deeply technical; so many factors have to work together in harmony. It will be interesting to see whether LED screens and their controllers become optimized for the cameras shooting them.

Tracking devices for cameras

An essential ingredient for production with an LED volume is a tracker for your camera movements. The Mo-Sys StarTracker often gets the nod as the go-to tracker.

The tracker mounts on your camera and communicates the camera’s position to the system, and this data allows the frustum to display the proper image. New fluid heads with encoders and data connections are beginning to come out as well. For instance, Cartoni introduced the encoded e-Maxima head, which feeds pan and tilt data to the system for even more accurate camera tracking.
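To give a feel for what the tracker actually sends, here is a minimal sketch of a per-frame pose payload. The JSON-over-UDP format is purely illustrative; real systems use dedicated protocols such as FreeD or vendor SDKs:

```python
import json
import socket

def send_pose(sock, host, port, *, x, y, z, pan, tilt, roll, timecode):
    """Stream one frame of camera pose to the render system."""
    payload = json.dumps({
        "position_m": [x, y, z],            # from the optical tracker
        "rotation_deg": [pan, tilt, roll],  # pan/tilt can come from an encoded head
        "timecode": timecode,               # shared reference for sync
    }).encode()
    sock.sendto(payload, (host, port))

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_pose(sock, "127.0.0.1", 9000,
          x=1.2, y=1.6, z=-4.0, pan=12.5, tilt=-3.0, roll=0.0,
          timecode="01:02:03:04")
```

The render system combines this pose with the lens’s field of view to place the inner frustum on the wall each frame.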

Lens data

Another important factor in achieving the right look is lens data. Focal distance and T-stop information are critical for VFX. Cooke’s /i Technology and ZEISS eXtended Data record the focus and iris positions. Lens data can also be captured by a wireless lens encoder and funneled into Unreal Engine.
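Under the hood, encoder systems boil down to a calibration table mapping encoder counts to physical lens positions. This sketch assumes a made-up table of the kind a wireless lens encoder produces; Cooke /i and ZEISS eXtended Data embed equivalent metadata in the lens itself:

```python
from bisect import bisect_left

# Hypothetical (encoder_count, focus_distance_m) pairs from lens calibration.
FOCUS_TABLE = [(0, 0.45), (1000, 0.6), (2500, 1.0), (5000, 2.0),
               (8000, 5.0), (10000, 100.0)]  # 100 m stands in for infinity

def focus_distance(encoder_count):
    """Linearly interpolate focus distance from the calibration table."""
    counts = [c for c, _ in FOCUS_TABLE]
    i = bisect_left(counts, encoder_count)
    if i == 0:
        return FOCUS_TABLE[0][1]
    if i == len(FOCUS_TABLE):
        return FOCUS_TABLE[-1][1]
    (c0, d0), (c1, d1) = FOCUS_TABLE[i - 1], FOCUS_TABLE[i]
    return d0 + (d1 - d0) * (encoder_count - c0) / (c1 - c0)

print(f"{focus_distance(3750):.2f} m")  # halfway between 1.0 m and 2.0 m -> 1.50 m
```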

ZEISS recently acquired the camera tracking company Ncam, so it will be exciting to see these technologies come together in a more integrated way. Hopefully, this will lead to more streamlined camera rigs in the future.

Crewing a shoot

Even though virtual production is a young discipline, it is key to have a crew with both experience and a knack for integrating new technologies. Noah Kadner has been a leading voice in the field of virtual production, and he has a great LinkedIn course that will help you and your team get up to speed with the basics of VP. Noah describes the crew collectively as the “Brain Bar” or the “Volume Control Team.” Team members do real-time compositing, supervise the shoot, or work with the physical LED panels.

Lighting

One of the first questions people ask is, “Can I use the LED panels to light my actors?” The panels can complement your primary lighting, but they don’t replace it. They do a great job of imparting edge lighting to your subject, which is a big relief compared to cleaning up edges on a green screen or dealing with green spill reflecting onto your actors. Lighting manufacturers are now using wireless DMX controls to interact with the LED volume. Prolycht has shown how its app works with its lights to match the ambient lighting of the LED volume, using a new adapter from Accsoon and an iOS device. This ingenious combination automates lighting in amazing new ways.
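The core idea behind that kind of automation is simple: sample the color of the virtual environment near the talent and translate it into fixture control values. A minimal sketch, assuming a generic 8-bit RGB fixture (the channel layout and the DMX transport are assumptions; products like the Prolycht/Accsoon pairing handle all of this for you):

```python
def ambient_to_dmx(pixels):
    """Average a set of (r, g, b) samples in 0..1 from the rendered
    background and return three 0..255 DMX channel values."""
    n = len(pixels)
    totals = [0.0, 0.0, 0.0]
    for r, g, b in pixels:
        totals[0] += r
        totals[1] += g
        totals[2] += b
    return tuple(min(255, int(c / n * 255 + 0.5)) for c in totals)

# A warm sunset background sampled behind the actor:
samples = [(0.9, 0.5, 0.3), (0.85, 0.45, 0.25), (0.95, 0.55, 0.35)]
print(ambient_to_dmx(samples))  # -> (230, 128, 77)
```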

Processing all that data

Puget Systems offers a great summary of the kind of hardware needed to feed 3D worlds to the displays. They recommend AMD CPUs over Intel and the NVIDIA RTX 6000 Ada or GeForce RTX 4090 GPUs. When choosing the right GPU for your studio, they note, “If you are dealing with a large LED wall, then the extra VRAM and support for a Quadro Sync card will make the RTX 6000 Ada the clear winner. For a simpler setup, especially a green screen, the GeForce RTX 4090 will likely suffice and save you money.”
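Wall size also drives how many machines you need: big volumes are split into tiles, each fed by a genlocked render node, which is exactly where Quadro Sync earns its keep. A rough sizing sketch, with a deliberately conservative and entirely illustrative limit of two 4K outputs per node (real capacity depends on scene complexity):

```python
import math

def render_nodes_needed(wall_px_w, wall_px_h,
                        outputs_per_node=2, out_w=3840, out_h=2160):
    """Count the 4K tiles covering the wall, then divide among nodes."""
    tiles = math.ceil(wall_px_w / out_w) * math.ceil(wall_px_h / out_h)
    return math.ceil(tiles / outputs_per_node)

print(render_nodes_needed(7000, 2800))   # a modest wall -> 2 nodes
print(render_nodes_needed(15000, 4000))  # a large volume -> 4 nodes
```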

Production design and the art department

LED volumes combine digital and practical set design in new ways. You might have a physical piece of furniture in some scenes and a digital double of that piece in the 3D world on the LED screen. There are many questions about where the foreground should end and the background should begin, and there’s a dance over who takes the lead when props and set decoration span these two worlds.

The upside is that artists who work in production and in post get the opportunity to collaborate in pre-production in brand-new ways. Directors can use VR headsets to explore the scene before the shoot. Cinematographers can plan their shots in pre-production and then see them on the monitor while shooting, instead of filling in a green screen with their imagination.

Conclusion

Virtual production offers exciting new possibilities, and it comes with plenty of technical challenges. It shouldn’t be viewed as a replacement for shooting on location, but it is an amazing tool for bringing the whole production together in a single moment or creating a world that would be cost-prohibitive to craft physically. At the same time, AI technology designed to key out backgrounds without green screens or LED panels is advancing rapidly. All of that makes virtual production an exciting space to watch and a genuine advancement in storytelling technology.

Header image credit: Wikimedia Commons

MediaSilo allows for easy management of your media files, seamless collaboration for critical feedback, and out-of-the-box synchronization with your timeline for efficient changes. See how MediaSilo is powering modern post-production workflows with a 14-day free trial.