
Adventures in Virtual Production


AJ Young


Since the industry shut down all of my live action work, I decided to learn a new skill set and dove deep into the world of virtual production. I said it in college and still believe it today: narrative film will increasingly move toward virtual production. Big-budget films already use a ton of CG environments, and The Mandalorian showed us how incredible real-time rendering on an LED wall can be.

As a primer, I found that this doc from Epic Games gives a fantastic intro to virtual production: https://cdn2.unrealengine.com/Unreal+Engine%2Fvpfieldguide%2FVP-Field-Guide-V1.2.02-5d28ccec9909ff626e42c619bcbe8ed2bf83138d.pdf

Matt Workman has been documenting his journey with indie virtual production on YouTube, and it's becoming clear how easy it is for even the smallest of budgets to utilize real-time rendering. I wouldn't be surprised to see sound stages or studios opening up that offer LED volumes and pre-made virtual sets at an affordable price, even for micro-budget features.

Here's what I've done so far:

[Attached image: render-comparison-06022020001.jpg]

Attached are renders from three engines: Blender's Eevee, Unreal Engine 4, and AMD ProRender. (Please excuse the heavy JPG compression; shoot me a DM for an uncompressed version.)

I made the building meshes from scratch using textures from OpenGameArt.org and following Ian Hubert's lazy building tutorials. The man is a 3D photoscan from Render People. All of these assets were free.

Lighting-wise, I was more focused on getting a realistic image than a stylized one. Personally, I think the scene is too bright and the emission materials on the buildings need more nuance.

  • Blender's Eevee:
    Honestly, I'm beyond impressed with Eevee. It's incredibly fast. For those who don't know, Eevee is Blender's real time render engine. It's what Blender uses for rendering the viewport, but it's also designed to be a rendering engine in its own right. Most of Eevee's shaders and lights seamlessly transfer over to Blender's Cycles (their PBR engine).

  • UE4:
    After building and texturing the meshes in Blender, I imported them into UE4 via .fbx. There was a bit of a learning curve and some manual adjustments for the materials, but ultimately I was able to rebuild the scene. The only hitch was the street lamps.
    In Blender, I replicated the street lamps using an array modifier which duplicated the meshes and lights. The array modifier doesn't carry over the lights into UE4 via the .fbx, so I had to import a single street lamp and build a blueprint in UE4 that combined the mesh and light.
    In the attached image, the street lamps aren't in the same place in UE4 because I was approximating. My next step is to find a streamlined way to import/export meshes, materials, cameras, and lights between Blender and UE4. I believe some Python is in order (see the sketch after this list for the kind of script I have in mind)!
    As expected, UE4 looks great!

  • AMD ProRender:
    Blender's PBR engine, Cycles, is great. However, its GPU rendering only works with CUDA and OpenCL, and I currently only have a late 2019 16-inch MacBook Pro. Its GPU is the AMD Radeon Pro 5500M 8GB, and newer Apple computers only use the Metal API, which Blender currently has no support for.
    Luckily, AMD has its own render engine called ProRender. Needless to say, the results are great and incredibly accurate. This engine isn't real time; render time for the AMD shot was 9 minutes. This render is definitely too bright and needs more nuance everywhere.
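Since I said some Python is in order, here's a rough, untested sketch of the kind of script I have in mind (the file paths are placeholders and I haven't built the UE4 side yet): export the meshes to .fbx with modifiers applied, so the array-duplicated lamps come across as real geometry, and dump the lights to a JSON sidecar so they can be rebuilt in UE4.

```python
import json
import bpy

# Placeholder paths; adjust to your project.
FBX_PATH = "/tmp/scene_for_ue4.fbx"
LIGHTS_PATH = "/tmp/scene_lights.json"

# 1) Export all meshes (and the camera) to FBX, applying modifiers so the
#    array-duplicated street lamp meshes become real geometry on export.
bpy.ops.export_scene.fbx(
    filepath=FBX_PATH,
    use_selection=False,
    object_types={'MESH', 'CAMERA'},
    use_mesh_modifiers=True,
)

# 2) Dump every light's transform and basic settings to JSON, since the
#    array-duplicated lights don't survive the FBX trip. Something on the
#    UE4 side can then respawn them from this file.
lights = []
for obj in bpy.data.objects:
    if obj.type != 'LIGHT':
        continue
    lights.append({
        "name": obj.name,
        "kind": obj.data.type,               # 'POINT', 'SUN', 'SPOT', 'AREA'
        "location": list(obj.location),
        "rotation_euler": list(obj.rotation_euler),
        "color": list(obj.data.color),
        "energy": obj.data.energy,
    })

with open(LIGHTS_PATH, "w") as f:
    json.dump(lights, f, indent=2)
```

On the UE4 side you could read that JSON with the editor's Python scripting plugin or a Blueprint and spawn point lights at those transforms; just mind the unit and axis conventions (Blender works in metres, UE4 in centimetres, and the coordinate handedness differs).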

My final thoughts:

Even though this is a group for Unreal, I'm astonished by Eevee, particularly how incredibly fast it is for running only on my CPU. (Again, Blender has no support for Apple Metal, so it defaults rendering to the CPU on my laptop)

The next iteration of Blender will be utilizing OpenXR. According to the Blender Foundation, they'll slowly be integrating VR functionality into Blender and version 2.83 will allow for viewing scenes in VR. With that in mind, I'm definitely going to experiment with virtual production inside Blender.

As for Metal support, I believe Blender will be moving from OpenCL/GL to Vulkan in the near future. From what I've found, it's easy to translate from Vulkan to Metal. (This part is a bit above my head as a DP, so I'll just have to wait or use a Windows machine)

Does anyone have any useful guides that have a streamlined process for moving things between Blender/UE4? I'm from live action and love how easy it is to bring footage from Premiere/FCPX to DaVinci and back via XML. Is there something similar or is .fbx the only way?

 


On 6/2/2020 at 9:05 PM, AJ Young said:

Even though this is a group for Unreal, I'm astonished by Eevee, particularly how incredibly fast it is for running only on my CPU. (Again, Blender has no support for Apple Metal, so it defaults rendering to the CPU on my laptop)

Correction:

After further research, I discovered that Eevee can only run on the GPU (according to Blender's documentation). Eevee is in fact running on a GPU on my laptop, but it's running on the display GPU (Intel UHD Graphics 630 1.5GB). My MacBook Pro has two GPUs: one for processing graphics and one for driving the displays.

Regardless, impressive.


I'd be curious to see something rendered on a serious render engine like Vray RT (even VRAY for Unreal), Arnold or Renderman. Something with some real physically based shaders and raytraced lighting and shadows. I remember Octane, a few years back, had a product they were trying to launch called Brigade, which they seem to have pulled the plug on, but it looked very promising. UE5 seems to have taken huge strides and could be a real viable option down the road especially if people like Lucasfilm are helping with the R&D.

Even in UE4, they showed the potential for a paradigm shift in filmmaking altogether: not just backgrounds, but the ability to create entire animated films in real time. Check out Reflections, which was made a few years back as a demo (I love the muzak version of The Imperial March). They did a demo where the presenter actually places himself 'on set' and can move the camera around the CG characters and environment, and all the lighting and reflections respond accordingly.

Edited by Phil Jackson

3 minutes ago, Phil Jackson said:

I'd be curious to see something rendered on a serious render engine like Vray RT (even VRAY for Unreal), Arnold or Renderman.

AMD ProRender is a physically based renderer that only uses ray-tracing! It doesn't do real time, but from what I've learned, neither do V-Ray, Arnold, or RenderMan.


  • 4 months later...

Howdy!

I wanted to update on my recent experimenting with AI for animating (because I'm a DP who only wants to focus on camera/lighting).

Here's a video I whipped together showing my experiment with animation:

I wanted to test whether I could get motion capture data from pre-existing video. VIBE (and a handful of other programs) uses AI/machine learning to estimate poses from video.

I took Fred Astaire from Second Chorus and ran the footage through VIBE. I converted the VIBE data to an .fbx file (a common 3D interchange format) and brought it into Blender. The armature's root didn't move around, so I had to manually match its location while the individual bones took their rotations from the VIBE data.
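For anyone curious what that manual location matching looked like, here's a minimal Blender sketch of the idea. It assumes you've already pulled the per-frame root translations out of VIBE's output into a simple JSON list; the exact keys inside VIBE's .pkl vary, and the file path and armature name here are just placeholders.

```python
import json
import bpy

# Hypothetical file: one [x, y, z] root translation per frame, pre-extracted
# from VIBE's output. The path and format are placeholders.
with open("/tmp/root_translation.json") as f:
    root_positions = json.load(f)

arm = bpy.data.objects["Armature"]   # the armature imported from the VIBE .fbx

# Keyframe the armature object's location every frame; the bone rotations
# already come from the VIBE data, this only adds the missing root motion.
for frame, (x, y, z) in enumerate(root_positions, start=1):
    arm.location = (x, y, z)
    arm.keyframe_insert(data_path="location", frame=frame)
```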

There was tons of clean up needed, particularly with getting the poses to work with my Batman model and making some of the moves more dramatic.

I feel this test really put VIBE through its paces because it wasn't an ideal video for it to guess poses from. Nonetheless, an interesting learning process!

[If you have trouble viewing the video on Twitter, shoot me an email: AJYoung.DP@gmail.com]


  • 1 month later...
  • 2 weeks later...

Martin Frohlich demonstrated using Blender's real-time engine, Eevee, in a virtual production environment with green screens. You can watch his talk from the 2019 Blender Conference, where he discusses it, here (start the video at 9:05 for the virtual production section):

The Cliff's Notes version of the video:

  • Frohlich made two films, each shot in two versions: traditionally on location, and exclusively virtually via green screen. The goal was to make the two versions of each film look identical; Frohlich wanted to see if the audience could tell the difference between on-location and virtual production.
  • The virtual production used Eevee as a real-time, on-set composite for the DP and director to view.
  • The virtual production was done on green screen with no LED volume. No composite was burned into the footage on set; they treated it more like a LUT, an estimate of what to expect in post-production.
  • They used a motion capture system to record the camera's position and rotation (see the sketch after this list for the general idea of driving a virtual camera from tracking data).
    • However, they got a lot of "noise" during the real-time capture of the camera's position and rotation, because the infrared mocap system was also picking up the infrared light emitted by the movie lights used on set.
    • To see this "noise", look at the background plate at 19:20. It's bouncing around because the mocap data is being interfered with by the real movie lights.
  • Game engines like Unreal Engine (UE4) and Unity were much faster than Eevee, at least as of his presentation in late 2019.
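As promised above, here's a rough sketch (in Blender Python, since that's what I've been playing with) of the general idea behind driving a virtual camera from tracking data: copy the tracked position and rotation onto the scene camera every frame, with a little exponential smoothing to damp exactly the kind of IR noise Frohlich describes. The read_tracker_pose() function is a made-up stand-in for whatever your mocap system actually provides, and the numbers are arbitrary.

```python
import bpy

# Stand-in for the mocap stream: replace with whatever your tracking system
# provides (position in metres, rotation as Euler angles in radians).
def read_tracker_pose():
    return (0.0, -4.0, 1.6), (1.4, 0.0, 0.0)

ALPHA = 0.3              # smoothing factor: lower = smoother but laggier
_smoothed = {"pos": None}

def apply_camera_pose(scene, *args):
    cam = scene.camera
    pos, rot = read_tracker_pose()
    # Exponential smoothing on position to damp tracking noise (e.g. the IR
    # interference from the movie lights described above).
    if _smoothed["pos"] is None:
        _smoothed["pos"] = list(pos)
    else:
        _smoothed["pos"] = [ALPHA * p + (1.0 - ALPHA) * s
                            for p, s in zip(pos, _smoothed["pos"])]
    cam.location = _smoothed["pos"]
    cam.rotation_euler = rot

# Update the camera on every frame change during playback.
bpy.app.handlers.frame_change_pre.append(apply_camera_pose)
```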

---

Personal thoughts:

One thing game engines currently do better than Eevee is baking lighting into the virtual set, known as lightmapping. Rather than calculating the light rays every frame, game engines paint the shadows and bounced light onto the virtual set ahead of time for anything that doesn't move (known as static). This means the computer only needs to render lighting in real time on things that are dynamic, like a moving light or a moving character.

UE4 and Unity perform lightmapping faster than Eevee, and the tools for doing so are built into their engines. In fact, Eevee doesn't do lightmapping at all; it's actually performed by a different render engine in Blender, Cycles. Cycles isn't real time and is more comparable to Pixar's RenderMan, Autodesk's Arnold, etc. This means it takes longer for Blender to render lightmaps than UE4/Unity. If a DP/director wanted to change some aspect of the virtual set lighting, it would take significantly longer to re-render the lightmaps in Blender.
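To give a sense of what lightmap baking looks like in Blender, here's a minimal sketch: switch the engine to Cycles and bake the diffuse lighting of the selected static objects into their active image texture nodes. It assumes the objects are already UV-unwrapped and each material has an image texture node selected as the bake target; the sample count and pass choices are arbitrary.

```python
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'     # baking goes through Cycles, not Eevee
scene.cycles.samples = 64          # keep samples low-ish so test bakes are quick

# Bake direct + indirect diffuse lighting (no albedo) into the active image
# texture node of each selected mesh. Assumes UVs and a target image exist.
for obj in bpy.context.selected_objects:
    if obj.type != 'MESH':
        continue
    bpy.context.view_layer.objects.active = obj
    bpy.ops.object.bake(
        type='DIFFUSE',
        pass_filter={'DIRECT', 'INDIRECT'},
        use_clear=True,
        margin=4,
    )
```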

Of course, to speed up lightmapping, Blender users can use add-ons like BakeLab 2. These speed up the tedious setup of lightmapping, but the choke point is still the Cycles rendering.

Eevee does use something like lightmapping, called irradiance volumes. Instead of painting the virtual set, Eevee places "probes" across the area to record how the bounced light should behave. This video describes it best: https://www.youtube.com/watch?v=9SD0of-mOHo The only downside to Eevee's irradiance volume method is that the lighting begins to flicker when you move the camera or objects, because the engine is updating the shadows.
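And for completeness, a tiny sketch of setting up an irradiance volume from Python and baking Eevee's indirect lighting cache; the probe placement and grid resolutions are arbitrary and would need tuning per set.

```python
import bpy

# Drop an irradiance volume probe roughly over the set and scale it to cover
# the area where characters and lights will move.
bpy.ops.object.lightprobe_add(type='GRID', location=(0.0, 0.0, 2.0))
probe = bpy.context.object
probe.scale = (10.0, 10.0, 3.0)
probe.data.grid_resolution_x = 8     # more samples = smoother bounce light
probe.data.grid_resolution_y = 8
probe.data.grid_resolution_z = 4

# Bake Eevee's indirect lighting cache (same as the "Bake Indirect Lighting"
# button under Render Properties > Indirect Lighting).
bpy.ops.scene.light_cache_bake()
```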

---

For those keeping track of these posts, CML's first web discussion of the year will be on virtual production. It's scheduled for Jan 6, 2021. I'll post a link with more details when CML does.

 


Fun to see someone else approaching the mixture of physical and virtual reality, from a different direction than we have.

What you see below were my first experiments with moving pictures (I had worked before with both Unity and 360° photography, so all the needed equipment and software was already at hand):


Almost every video on this channel is an attempt at combining Unity-based real-time rendered surroundings with real-world capture, done in a very DIY way, originally only to prove that it could be done (we did not expect the interest the project would gather).


4 minutes ago, AJ Young said:

Very interesting! Did you shoot this in front of a green screen?

You can say that.
Specifically, our green bedroom wall, using a yoga mat as a backdrop extension.
I also had those two slim 1m LED lights made so the real-world video matches the in-game lighting a bit better.


I forgot to mention that even though the camera is tracked in six degrees of freedom by the computer (so moving it results in a corresponding perspective shift in the plate), we haven't recorded a single video with a moving camera yet because of practical limitations, but we're gearing up to finally do something about it.


50 minutes ago, Phil Rhodes said:

What's the approach for camera tracking, then?

P

We use the same tracking system that monitors the position (and rotation) of the headset and controllers that you see Michalina use in the video.
In addition to those three objects, we add a fourth one (a controller from another set) that is fixed to the real-world recording camera; it is then recognized by the tracking sensors, providing its own position and rotation data in three dimensions, with no need for trackers or reference objects in the scene.

 



Also Tomasz, am I understanding this correctly? You are getting the girl to play Beat Saber and filming her on green screen as you do so, so that when you superimpose her on the computer image her motions are automatically in sync with the motion of the game, because she was actually playing it?


24 minutes ago, Mei Lewis said:

Tomasz, would you be able to go through what kit you use in a bit more detail?

A lot of people are using the Vive Tracker, right? But the thing on your camera isn't that?

The Vive Tracker that you linked serves a similar purpose in HTC/Steam's inside-out tracking solution (Vive headsets, controllers and trackers are equipped with light sensors that work out where in the world they are by receiving flashes of light from oscillating illuminators, called Lighthouses, which you place in the corners of your space). The Vive Tracker is exactly what I would be using if we were in the HTC hardware ecosystem.

Inside-out v Outside-in: How VR tracking works, and how it's going to change

In our case, since we're using first-generation Oculus hardware, the tracking works outside-in (cameras/light sensors are placed stationary in the corners of your space, while the controllers and headset blink their unique identifiers using IR LEDs), so it works in a very similar way, just in the opposite direction.

New Oculus Touch Controller Leak Points To Finger Sensing, New Haptics

 

28 minutes ago, Mei Lewis said:

Also Tomasz, am I understanding this correctly? You are getting the girl to play Beat Saber and filming her on green screen as you do so, so that when you superimpose her on the computer image her motions are automatically in sync with the motion of the game, because she was actually playing it?

That is correct, with the caveat that the computer-generated image has to be rendered from exactly the viewport (position + rotation + field of view) that matches the real-world camera; establishing that viewport is the purpose of the camera-tracking solutions described above. If I mess up the calibration between the two, it results in less convincing parallax between real-life talent and VR-based objects, like at 1:53 in the video below.
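For anyone trying to reproduce that calibration, the field-of-view part at least is just trigonometry: the horizontal FOV follows from the real lens's focal length and the sensor width, and you give the virtual camera the same numbers. A quick sketch (Blender-flavoured, but the same idea applies in Unity or UE4; the 24 mm lens and 24.9 mm sensor values are only an example, and the camera object name is assumed):

```python
import math
import bpy

def horizontal_fov(focal_length_mm, sensor_width_mm):
    """Horizontal field of view (radians) of a real lens/sensor pairing."""
    return 2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm))

# Example: a 24 mm lens on a 24.9 mm-wide (Super 35-ish) sensor.
print(math.degrees(horizontal_fov(24.0, 24.9)))   # ~54.8 degrees

# Matching the virtual camera in Blender: give it the same sensor width and
# focal length and it derives the identical field of view.
cam = bpy.data.objects["Camera"].data
cam.sensor_fit = 'HORIZONTAL'
cam.sensor_width = 24.9
cam.lens = 24.0
```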
 

 


1 hour ago, Mei Lewis said:

I really want to try some of this, but don't know where to start. Is there a step by step guide somewhere?
 

For Oculus, there's
https://creator.oculus.com/mrc/
and here's the guide for LIV, which is sort of its SteamVR counterpart, although developed independently and available as an add-on tool:
https://help.liv.tv/hc/en-us


I'm sorry about the double posts; unfortunately this forum doesn't allow edits (or even addendums) after more than 5 minutes.


  

I'm trying to figure out the best way for myself to approach the whole topic of virtual production.

My starting point is that I have some skills as a real-life DoP, including camera movement, lighting, shooting, etc. I'm also a very techy person, having been a programmer in a past life and having played loads of games, including some of the early Unreal games that led to Unreal Engine.

I loved The Mandalorian, and with that, plus the fact that Unreal Engine and real-time production are on the cusp of being technically possible, now seems like the time to explore all this.

So where to start... for me it's going to be learning as much about this whole area as possible.

There's no way for me to get access to the sort of screen volume that The Mandalorian etc. used, but I have cameras, lights and computers, and much of the software involved is free.

So first step for me is to learn Unreal Engine!

I'm already part way through a couple of courses about it on Udemy.
Now I've found this which seems like a better place to start:
https://www.unrealengine.com/en-US/onlinelearning-courses/virtual-production-primer?sessionInvalidated=true

  


The more I look into this, the more I realise that it's going to be a fundamental part of *all* cinematography in future.

We all need to understand some elements of it, or get left behind.

In a few years directors will be able to make their whole film, in a draft CG form, using their laptop, as easily as using an NLE, without ever needing a real life cinematographer. And by the point they come to make the 'real' film, most of the artistic decisions cinematographers like to make will have already been made without them.


18 minutes ago, Mei Lewis said:

The more I look into this, the more I realise that it's going to be a fundamental part of *all* cinematography in future.

We all need to understand some elements of it, or get left behind.

In a few years directors will be able to make their whole film, in a draft CG form, using their laptop, as easily as using an NLE, without ever needing a real life cinematographer. And by the point they come to make the 'real' film, most of the artistic decisions cinematographers like to make will have already been made without them.

I think you're right, but I also find myself struggling to care much anymore. I can play video games (I do play video games) and I want cinema to be real stuff. I'm largely alone in this, I know. Possibly it's because I actually spent a lot of time as a teenager drawing around people in very inadequate software on Amigas in order to do this sort of thing, including terrible attempts at camera tracking, and I spent that whole time itching to do things for real.

I appreciate it's great; what's being done on The Mandalorian is apparently good (I haven't seen it), but honestly, I'm just losing the ability to give a damn.

P


I play games too, far less than I used to. But I prefer films without a lot of (visible) CG and I prefer filming 'real stuff' too.

At some point though, not very far off, even 100% 'real stuff' will be pre-vizzed in the computer before it's filmed for real. Unless cinematographers are involved in that pre-viz, they won't have as much of a say when it's actually filmed.

 

 

