Virtual production is a next-generation workflow that combines real-time hardware and software tools to help filmmakers visualize their films on set. Avatar was one of the first films to use it.
Since then, plenty of visual effects films, like The Avengers, Kong: Skull Island, Star Wars: The Force Awakens, Rogue One: A Star Wars Story, and Doctor Strange, have shown where the power of virtual production can come in.
The term can refer to any technique that allows filmmakers to plan, imagine, or complete some kind of filmic element, typically with the aid of digital tools. It tends to be used to help visualize complex scenes, or scenes that simply cannot be filmed for real.
Virtual production tools in live action mostly relate to the capturing of action, often with simul-cams or virtual cameras. These are effectively screens acting as virtual cameras that can be tracked just like motion-captured actors are tracked, so that someone looking at the screen, such as the director, can see an almost-finished composited image of what the final shot may look like.
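Conceptually, each frame of that simul-cam loop does three things: read the tracked camera pose, render the synthetic world from that pose, and composite the result with the live plate. The sketch below illustrates the idea in C++; the Tracker, CGRenderer, and LiveCamera types are hypothetical stand-ins for a tracking vendor's API and an engine SDK, not any real product's interface.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Hypothetical stand-in types for illustration only; a real rig would
// use an engine SDK (e.g. Unreal) plus a tracking vendor's API.
struct Pose  { float position[3] = {}; float rotationQuat[4] = {0, 0, 0, 1}; };
struct Frame { int width = 0, height = 0; std::vector<uint32_t> rgba; };

static Frame blankFrame(int w, int h) {
    Frame f; f.width = w; f.height = h;
    f.rgba.assign(static_cast<std::size_t>(w) * h, 0);
    return f;
}

struct Tracker    { Pose  latestPose()          { return {}; } };                      // camera tracking
struct CGRenderer { Frame renderCG(const Pose&) { return blankFrame(1920, 1080); } };  // engine render
struct LiveCamera { Frame capturePlate()        { return blankFrame(1920, 1080); } };  // video feed

// Overlay CG pixels onto the live plate wherever the CG alpha (top byte) is set.
Frame composite(Frame plate, const Frame& cg) {
    for (std::size_t i = 0; i < plate.rgba.size(); ++i)
        if ((cg.rgba[i] >> 24) != 0) plate.rgba[i] = cg.rgba[i];
    return plate;
}

// One iteration of the simul-cam loop; the on-set monitor shows the result.
Frame simulCamFrame(Tracker& tracker, CGRenderer& renderer, LiveCamera& camera) {
    Pose  camPose = tracker.latestPose();       // track the physical camera rig
    Frame cg      = renderer.renderCG(camPose); // render the synthetic world from that pose
    Frame plate   = camera.capturePlate();      // grab the live-action frame
    return composite(plate, cg);                // what the director sees
}
```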
But it’s not just about being able to visualize these synthetic worlds; it’s also about being able to change them quickly when necessary. That is why real-time rendering and game engines have become a big part of virtual production, and why real-time rendering, live-action filmmaking, and visual effects have started to merge.
The film The Jungle Book
Filmmakers can also use virtual production to control the environment being filmed in, typically a green screen with fixed lighting and cameras that can be easily tracked. Several systems allow real-time graphics to be ‘overlaid’ over presenters and to interact with them; a rough sketch of the keying step appears below.
The ABC series, Once Upon a Time
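For a sense of what that ‘overlay’ involves under the hood: the core of green-screen keying is computing, per pixel, how strongly the green screen shows through, then mixing the graphics in by that amount. Below is a minimal sketch in C++ using the common "green minus max(red, blue)" heuristic; production keyers are far more sophisticated, and the 64.0f softness constant is an arbitrary illustrative choice.

```cpp
#include <algorithm>
#include <cstdint>

// One RGB pixel, 8 bits per channel.
struct Rgb { uint8_t r, g, b; };

// Crude chroma-key matte: how much more green than red/blue a pixel is.
// Returns 0.0 (keep the presenter) .. 1.0 (green screen: show graphics).
float greenMatte(Rgb p) {
    float excess = float(p.g) - float(std::max(p.r, p.b)); // "green excess" heuristic
    return std::clamp(excess / 64.0f, 0.0f, 1.0f);         // 64 is an arbitrary softness knob
}

// Mix the real-time graphics (bg) in behind the presenter (fg), per pixel.
Rgb keyPixel(Rgb fg, Rgb bg) {
    float a = greenMatte(fg); // 1.0 where the green screen shows through
    auto mix = [a](uint8_t f, uint8_t b) {
        return static_cast<uint8_t>(f * (1.0f - a) + b * a);
    };
    return { mix(fg.r, bg.r), mix(fg.g, bg.g), mix(fg.b, bg.b) };
}
```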
At SIGGRAPH 2019, Fox Renderfarm was honoured to interview Aaron Daly, a virtual production artist for Netflix, who shared his working experience and thoughts about virtual production with us. We were also honoured to have Leonard Daly, President of Daly Realism, host an exclusive interview with Aaron.
Aaron Daly
·VFX Technical Artist for Netflix
·Virtual Production Artist
·Technical Artist and Previs Artist for CBS Corporation (2017-2019)
·Technical Artist for Last Call Games (2014-2018)
Works:
Stranger Things (Netflix), Death Note (Netflix), Strange Angel (CBS)
Leonard: Hi, this is Leonard Daly again, and I'm here with Aaron Daly. I'll let him introduce himself.
Aaron: Hi, my name is Aaron Daly. I am a technical artist and virtual production artist working in real-time engines, mostly the Unreal Engine. I currently work on a Netflix original series production, and I've worked for CBS and been on many shows before. I got my start in games, mostly working in VR. I started with VR educational content, and then I moved to CBS to do VR experiences for TV shows and games. I worked on things like Stranger Things and Death Note, a CBS show called Strange Angel, and a couple of pilots that didn't get to air. That is mostly my background and who I am.
Aaron’s works
Leonard: SIGGRAPH has changed over the years. What in particular brings you to SIGGRAPH this year, and what changes have you noticed from previous years?
Aaron: Every year I feel that SIGGRAPH has a sort of theme, and I feel a lot of this year's theme has been virtual production technologies. You're seeing a lot of companies out there using Unreal, using real-time technologies, to previz and find solutions for television shows and movies. And that's why I'm here, to see the latest and greatest in technology for virtual production. I particularly enjoyed companies like GlassBox, who are making their own virtual production virtual camera to streamline the process through Unreal.
GlassBox’s virtual production
Leonard: You mentioned that there are lots of new virtual production technologies this year. Is there anything here that you see that's gonna solve a particular problem, or is there anything that has been extraordinarily difficult for you to solve recently?
Aaron: One of the biggest problems relates to what I end up doing a lot, which is working in a previz environment. I'm trying to get the best-looking thing out there as fast as possible, so that the filmmakers can give notes and come back to me. The whole point of using real-time technologies is that you don't need to wait around for renders or anything like that; it's going to be right there. So one of the main issues for me, and one of the things I've been trying to solve, is how to get motion capture technology into the engine. There are a lot of companies out here using motion capture technology, whether you use optical-based motion capture like an OptiTrack, or inertia-based motion capture like the Rokoko or Xsens suits. I've found that the Rokoko and Xsens type is more valuable because you don't need a huge footprint to get a setup up and running, and it's a much cheaper entry level than OptiTrack or anything like that.
OptiTrack’s motion capture technology
Rokoko’s face capture
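For context on what getting motion capture into the engine involves: an inertial suit typically streams a rotation per joint over the network every frame, and the engine applies those rotations to a character skeleton (in Unreal this commonly arrives through the Live Link plugin). The sketch below is a simplified, hypothetical receiving side in C++; JointSample and Skeleton are illustrative types, not any vendor's SDK.

```cpp
#include <map>
#include <string>
#include <vector>

// Hypothetical per-joint sample, as an inertial suit might stream it
// every frame: one local-space rotation (quaternion) per named joint.
struct Quat        { float x = 0, y = 0, z = 0, w = 1; };
struct JointSample { std::string joint; Quat localRotation; };

// Minimal stand-in for an engine skeleton: joint name -> local rotation.
struct Skeleton {
    std::map<std::string, Quat> localRotations;
    void setJointRotation(const std::string& joint, const Quat& q) {
        localRotations[joint] = q;
    }
};

// Apply one streamed frame of mocap to the skeleton. A real pipeline
// (e.g. Unreal's Live Link) also handles retargeting, timestamps,
// interpolation, and dropped packets; all of that is omitted here.
void applyMocapFrame(Skeleton& skel, const std::vector<JointSample>& frame) {
    for (const JointSample& s : frame)
        skel.setJointRotation(s.joint, s.localRotation);
}
```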
Leonard: We're here at the Fox Renderfarm booth. Have you heard of Fox Renderfarm, and have you had any use for it in relation to your real-time work?
Aaron: I have heard of Fox Renderfarm. Unfortunately, I've never personally had any use for it. Working a lot in real-time technologies, I don't need to use render farms much, but I hear great things. They do great work.
Leonard: Is there anything in particular you'd like to say to the people of Shenzhen or any place else about your experiences here at SIGGRAPH?
Aaron: The only thing I'd have to say is the future of virtual production is gonna be amazing, and I'm really excited for what it has to offer.