
Virtual Production: How it Works, Technologies and Applications Explained

In the world of AV, technology is virtually limitless. The use of networks and other IT-centric technologies and practices has ushered in an unprecedented range of robust, flexible and scalable media applications across the commercial AV integration market. Additionally, as a result of the AV over IP (AVoIP) revolution, content can be delivered nearly anywhere clients want to access it.

Ever-increasing computing power is also introducing another specialized technology platform to use in AV setups: virtual production. As the capacity of media servers increases, and display hardware costs decrease, this content production process is being used more and more for corporate, house of worship and higher-education applications.

With rising interest from tech-savvy end users, a growing range of AV-ready hardware and software offerings, and new training opportunities available, now is a great time for integrators to expand their skill set with virtual production. Furthermore, the demand for more dynamic content continues to grow, so there is real growth potential for AV designers and integrators who can offer their customers the ability to add high-quality graphics and visual effects to meetings, marketing content, classes and worship services.

Even with limitless possibilities, the goal remains the same: What is the story the client wants to tell? And what are the best technologies for producing and distributing that story?

How it works

Virtual production technologies vary according to application and output. Within television and film production, where the focus is on what is captured by the camera, it’s often referred to as In-Camera Visual Effects (ICVFX). If you combine physical sets and Augmented Reality (AR), it’s regarded as Extended Reality (XR) or Mixed Reality (MR), which can enhance the immersive and interactive elements of hybrid streaming and live events.

But for the commercial AV world, virtual production technologies usually refer to “the use of this technology in a broadcast environment where sensor data and real-time rendering systems are combined with LED walls and mapped lighting to create a fully realized virtual world, in-camera, in real time,” according to Matt Ward, in an article for frame:work.

Virtual production blends physical video production techniques with numerous virtual capabilities to create dynamic media. In the highest-end cases, video game engines are used to design 3D photorealistic environments and render them in real-time so they can be explored by tracked cameras moving through the virtual world. It’s also possible to use 2D content. The virtual environment is displayed in real-time on large-scale video walls or projection screens that are used as backdrops for stages of all sizes and scales.

Basically, it’s like a much more advanced and potentially expensive green-screen setup, with the main difference being that the computer-generated, interactive background is viewed and captured live, in real-time, instead of added in post-production. So, anyone presenting inside the space actually feels surrounded by the virtual scenic design, instead of being asked to imagine that a giant green wall is a mountain range or a seascape or a spaceship.

Because of the immersive, live, interactive surroundings they create, these virtual stages can feel more “real-life” than virtual. This is helpful for presenters, performers and producers who can see something close to the final product right there, live, all around them. Except in cases where a layer of graphics or AR content is being added to the final product — those extra details will be seen only in the livestream or recorded output, unless the live-composited image is shown on Image Magnification (IMAG) screens or other monitors in the space, allowing the presenters to see and interact with all the layers as they are added in real-time.

There are many different ways the system can be configured, which will be appealing to AV integrators. Let’s look at some of the variables:

The volume and lighting

First, you need to set the scene with a large backdrop that will display the virtual environment. This is called “the volume.” The size, shape and media of this volume will vary according to budget, space, and most importantly, your customer’s goals. What are they trying to achieve? And what type of media will help tell the story that delivers that result?

If they’re looking for atmospheric imagery that will always be in the deep background of a camera shot or a live stage setup, projection might work. But if they want flexibility in how they’re shooting, or they want to add a lot of interactivity to their content, LED might be the way to go.

With LED, budget and the scale of the stage will determine pixel pitch. Remember, there is a big difference between how a camera sees LEDs and how the human eye does. The camera will pick up on nuances and color problems that we can’t see. One of the biggest issues is the moiré effect, where the tight grid of LED pixels interferes with the camera sensor’s own pixel grid, creating wavy lines in your video image. It’s similar to pointing a camera at someone wearing a checked pattern: the video image ends up looking distorted because the camera’s sensor struggles to resolve the tiny squares on the fabric.
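
To put rough numbers on those compatibility checks, a quick sketch can flag the two most common pitfalls before gear is ordered: shooting too close to the wall for its pixel pitch, and pairing a camera frame rate that doesn’t divide evenly into the panel’s refresh rate. The rule-of-thumb constants below are illustrative assumptions, not vendor specs; always confirm the specifics with your camera and LED manufacturers.

```python
# Rough pre-install compatibility checks for an LED wall and camera.
# The rule-of-thumb constants are illustrative assumptions, not specs.

def min_camera_distance_m(pixel_pitch_mm: float, factor: float = 1.0) -> float:
    """Common rule of thumb: keep the camera at least ~1 m away per mm
    of pixel pitch to reduce visible moire. 'factor' tightens or relaxes
    the rule for a given sensor/lens combination."""
    return pixel_pitch_mm * factor

def refresh_is_sync_friendly(led_refresh_hz: int, camera_fps: float) -> bool:
    """Banding and dropped frames are less likely when the LED refresh
    rate is an integer multiple of the camera frame rate."""
    return led_refresh_hz % camera_fps == 0

# Example: a 2.6 mm pitch wall shot at 30 fps on a 3840 Hz panel
print(min_camera_distance_m(2.6))          # keep camera roughly 2.6 m back
print(refresh_is_sync_friendly(3840, 30))  # True: 3840 / 30 = 128
```

A check like this is only a starting point for the manufacturer conversations described above, but it helps rule out obviously mismatched pairings early.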

There are things you can do to avoid moiré. But check with your camera and LED screen manufacturers first to find out if your setup will be compatible. Those conversations will be helpful too, because soon there might be more options – many LED display manufacturers are developing products that will work better with cameras. Also, make sure your video wall processor is powerful enough to keep latency low, so you can avoid sync problems between the content on the screen and the movements of the camera. LEDs also have the added benefit of providing some lighting support, and ceiling and floor displays can also help create realistic reflections on physical set pieces and presenters.
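
Latency can be budgeted the same way: sum the delay each stage of the pipeline adds and compare the total against a target measured in camera frames. The per-stage figures below are hypothetical placeholders, not measured values for any real processor or panel.

```python
# Back-of-envelope glass-to-glass latency budget for camera/wall sync.
# Stage latencies are hypothetical placeholders, not vendor specs.

STAGES_MS = {
    "tracking_sensor": 8,
    "render_engine": 17,   # roughly one frame at 60 fps
    "wall_processor": 12,
    "led_panel": 3,
}

def total_latency_ms() -> int:
    """Sum the delay contributed by each stage of the pipeline."""
    return sum(STAGES_MS.values())

# A common target: keep the pipeline within ~3 frames at 60 fps (50 ms)
budget_ms = 3 * (1000 / 60)
print(total_latency_ms(), total_latency_ms() <= budget_ms)  # 40 True
```

If the total blows the budget, the wall processor and render engine are usually the stages worth upgrading first.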

Cameras and sensors

You’ll need to calibrate color reproduction between the wall and camera — either a manually operated camera or a production-quality PTZ. Cameras are a whole other topic, as there are compatibility issues between certain cameras and certain LED screens. If the camera’s refresh rate is different from the screen’s, you can end up with visual artifacts or dropped frames.

Cameras are tracked using infrared markers or sensors, which follow the shot and shift the virtual background based on the position and movement of the capture device. Tracking technologies are continuing to improve, making it easier to present realistic and dynamic content.

Media servers

Media servers enable real-time content changes, which enhances interactivity and allows certain adjustments to be made quickly. This speeds up the whole content pipeline, making it feel more like a live performance than standard media playback.

When it comes to choosing between 2D and 3D, keep in mind that the answer isn’t always to do the most technically intense 3D workflow for the ultimate cool factor. Sometimes a 2D background is more than enough. Several video production systems on the market will readily handle 2D content, including virtual backgrounds and on-screen graphics.

If you want to go into 3D, then a game engine needs to be added to the workflow. Game engines are used to create backgrounds and 3D environments that can be navigated — just the way a player would move through a game’s world. The background can move in any direction, at any velocity needed, to allow the presenter to “move” through a space without physically stepping beyond the volume’s edges.

The size of the volume will vary — for film and television production, virtual production sets are gigantic. But even a small space can be used, thanks to the miracle of set extension. That’s when you use a media server to stitch a virtual background, in real-time, outside the edges of the LED walls. So, if your presenter is standing in front of a 12’ wide x 8’ tall LED screen, you can make it look like they’re standing in a volume that’s twice that size, or more.
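
The amount of background to stitch can be estimated from basic lens geometry: the camera's horizontal field of view determines how wide the shot is at the wall plane, and anything beyond the physical panels has to come from the media server. The camera distance and field-of-view figures below are illustrative assumptions; real mappings are configured per-frustum in the media server.

```python
import math

# Sketch of the set-extension math: how much virtual background a media
# server must stitch beyond the physical LED wall for a given shot.
# Camera distance and FOV are illustrative assumptions.

def visible_width_ft(distance_ft: float, hfov_deg: float) -> float:
    """Width of the camera's view where it intersects the wall plane."""
    return 2 * distance_ft * math.tan(math.radians(hfov_deg) / 2)

def extension_needed_ft(wall_width_ft: float, distance_ft: float,
                        hfov_deg: float) -> float:
    """Total virtual width to stitch outside the wall; zero when the
    physical wall already fills the frame."""
    return max(0.0, visible_width_ft(distance_ft, hfov_deg) - wall_width_ft)

# The article's 12' wide wall, camera 10' back with a 70-degree lens:
print(round(extension_needed_ft(12, 10, 70), 2))  # roughly 2 ft to stitch
```

Move the camera closer or use a narrower lens and the extension requirement drops to zero, which is why tight shots are the cheapest way to work inside a small volume.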

Remember, the real-life image captured in-camera is composited in real-time with the virtual scene for live streaming, broadcast or recording. So green-screen rules still apply — people in the room won’t see the extended set. And if someone steps beyond the volume’s edges, they will disappear in the video output. But wherever the audience sees the stitched-together image, either on an IMAG screen somewhere else in the venue, or on any other video platform, they’ll get the full experience.

Keep in mind, media servers are becoming more powerful all the time, so real-time content changes are much faster now. And those media server companies are also working diligently to make it easier to use these advanced systems. They’re creating accessible interfaces that merge screens, mapping and compositing together for ease of use.

Control

Once the content assets are loaded into a content management system, they can be modified for any presentation — swapping out logos, or rearranging pieces to tell a story that suits the purpose of a meeting, classroom exercise or message. For smaller, streamlined setups with a single presenter and a single microphone, control of the full audio and video system can be handled via presets on an iPad or a simple push-button video production interface.
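
A preset layer like that can be as simple as a lookup table mapping one button per scene to a bundle of device settings. Everything below (preset names, setting keys, values) is a hypothetical stand-in for whatever control API the actual switcher, media server and DSP expose.

```python
# Illustrative sketch of a one-button preset layer for a single-presenter
# room. Names and settings are hypothetical stand-ins for a real control
# system's API.

PRESETS = {
    "welcome": {"background": "lobby_loop.mp4", "camera": "wide",
                "mic_gain_db": -6, "lower_third": None},
    "keynote": {"background": "stage_3d_scene", "camera": "tracked",
                "mic_gain_db": -3, "lower_third": "speaker_name"},
    "qa":      {"background": "neutral_gradient", "camera": "wide",
                "mic_gain_db": -3, "lower_third": "Q&A"},
}

def apply_preset(name: str) -> dict:
    """Look up a preset bundle. A real controller would push each
    setting to the matching device rather than just return the values."""
    if name not in PRESETS:
        raise KeyError(f"unknown preset: {name}")
    return PRESETS[name]

print(apply_preset("keynote")["camera"])  # tracked
```

Keeping every setting in one bundle per button is what makes the iPad-style interface workable: the presenter never has to touch individual devices.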

AV applications

For AV integrators looking to dive into virtual production, there are many potential applications within a variety of verticals.

Corporate

Since the earliest days of virtual production, corporate and enterprise clients saw the technology's potential to distinguish their brand using marketing, sales, social media, meetings, live streaming and training content. Lots of companies already had advanced, broadcast-level studios in-house. So, it was only natural for them to make the leap to a production process that would make it possible for them to do even more within those dedicated spaces.

But the good news is, it’s not just the big companies that can do this. Virtual production spaces don’t necessarily have to be gigantic. There’s a lot you can do even in a 10’ x 10’ setup, thanks to set extension and all those in-camera visual effects. For a compact setup, use a basic 2D media server for playback, and opt for mostly stock backgrounds.

In terms of location, sometimes all you need is an empty office or conference room. Depending on the amount of space and budget, an array of volume options is available: a one-screen backdrop, a two-walled corner setup or a three-sided LED setup. If you have a space as large as an actual broadcast studio, go for the full 360-degree surround with LED ceilings and floors. Install some production racks, switchers and editing bays, and now you’re repurposing underutilized office space so your customers can produce content they previously would have outsourced or paid for studio time to create.

With virtual production, you can make your customer’s meetings, product launches, training and recruitment so much better by helping them to:

  • Produce backdrops that can change with the environment, and also provide lighting and other effects that will enhance any kind of video production.
  • For product-centric sales assets, create lifestyle videos, seasonal content, or a whole year’s worth of social media without a lot of set construction or flights to other climates and cities. If your customer can create the 3D assets to make it look like the presenter is in Chicago on a snowy day and then shift the scene to a beach in Miami, they can make anything happen.
  • Save money on post-production, too. With everything captured in-camera, live, in one shoot, there’s no need to go back in and replace backgrounds, as you would in green-screen productions.
  • Make product launches and demos feel like real-life. Make it interactive, with a blend of physical props and virtual content. Maybe the presenter could be in front of a factory floor, showing the features of something fresh from the assembly line. Use a physical object and then add digital overlays to zoom in to the details. Or create any showroom environment that will help to tell the story.
  • Easily customize content with sponsor logos and other graphics, live and in real-time.
  • Create a simulation environment for training and development.
  • Elevate a town hall meeting into something different from all the other video meetings that employees attend every day.
  • Create identical rooms on either end of a video call for a more seamless environment.
  • Reuse digital assets for VR applications, simulations, or in a 2.5D application where the 3D asset can be explored via web browser.

Houses of worship

Houses of worship are an interesting application, because much of the production will actually be closer to a live performance. Rather than simply creating content that is built for the camera and/or the streaming audience, there will also be people in the room, seeing the LED walls and sets. Fortunately, to make the most of the in-camera visual effects, simply use the IMAG screens to show the composited images. All you need is a tracked camera and then you’re most of the way there for a virtual production setup. And of course, the augmented content will be great for online viewers.

Higher education

Colleges, schools and universities can use virtual production as a helpful resource for different subjects. It can be used as an instructional tool by theater, film and communications departments to help prepare students for the evolving workplace. Many universities are also building LED volumes to use for new creative technology and immersive entertainment departments.

And there are many additional possibilities including:

  • Interactive content that combines physical materials with 3D models that can be explored in real-time
  • Immersive simulation and training environments
  • Sports media, broadcasts, press conferences and team promotions
  • Recruitment and retention efforts
  • Collaborative and commercial opportunities for local industry, which might borrow or rent the space

Conclusion

In this limitless era, the creative technologies of video game developers, mixed reality producers, animators, interactive programmers and generative media artists are playing a more significant role in traditional AV applications. This physical-meets-virtual video production process, once the domain of the filmmaking industry and of massive-scale esports events that blend live audiences with streamed content worldwide, is now being applied more and more by C-suite executives, professors and worship leaders. Contact our Systems Design team today to learn how ADI can help you with your next virtual production project.
