Keep under 4mm pixel pitch LED for virtual production

What if you’re a couple of guys in a lab on a Friday, and all you have is a 2m x 4m, 4mm (3.9mm to be exact) pixel pitch LED wall loaned to you by your mates at 80Six? Looking at it, what could we get away with shooting on it?

That was this week’s OSF Friday Lab challenge. Usually this particular panel is only used in our development studio for system testing, not for shooting. But what does an almost 4mm pixel pitch screen actually look like up close and personal, in a real cinema camera? It’s not bad, and it’s not great either, but if you take a look, it may be good enough for what you want to shoot.

As expected, trying not to see the LED pixels in camera at this huge pitch is crazy hard. Forget focusing anywhere within 4m of your LED; it’s not going to happen, you’ll see pixels in your final image. Instead we had to position the talent at least 4m away from the screen, supporting the basic rule of thumb of roughly 1m of talent-to-screen distance per 1mm of pixel pitch.
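As a quick sanity check, here’s that rule of thumb as a couple of lines of Python. The 1m-per-1mm factor is only the approximation quoted above, not a hard optical law:

```python
def min_talent_distance_m(pixel_pitch_mm: float, metres_per_mm: float = 1.0) -> float:
    """Rule-of-thumb minimum talent-to-screen distance: ~1m per 1mm of pitch."""
    return pixel_pitch_mm * metres_per_mm

pitch_mm = 3.9  # the Infiled ER3.9 panel used here
print(min_talent_distance_m(pitch_mm))      # ~3.9m from a single wall
print(2 * min_talent_distance_m(pitch_mm))  # ~7.8m clearance shooting in the round
```

Shooting reverse or in the round means keeping that clearance on both sides of the talent, which is where the roughly 8m figure further down comes from.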

We went wide open on the iris of our old RED zoom (T2.9), then hit max on the LED processor’s nits (to try and get as much light between the spaced-out LED pixels as possible), but then we had to add ND filtration to counter the maxed-out LED. Even then we struggled; a 2.9 could really do with being a 1.9.
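For context, the gap between those two stops is easy to put a number on; a quick back-of-the-envelope in Python:

```python
import math

def stops_between(t1: float, t2: float) -> float:
    """Exposure difference in stops between two T (or f) numbers."""
    return 2 * math.log2(t2 / t1)

print(round(stops_between(1.9, 2.9), 2))  # ~1.22 extra stops at T1.9 vs T2.9
```

That extra stop and a bit is also a shallower depth of field, which helps keep the pixels soft.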

Also, if you ever need to use a pixel pitch this big for real, you are going to need a big shooting stage. What you see here was a total battle with focus points, and the stage area such a high pixel pitch LED demands (potentially 8m of wall clearance if shooting reverse or in the round) is considerable.

This means you need a lot more space when you’re shooting with higher pitch LED panels, as you need room to keep your depth of field from bringing the pixels into sharp focus. All in all, a good day’s shooting, and we are happy with the results given the minimal crew: no post, all shot and edited in a day. It still amazes us how efficient virtual production is at filming content at speed.

To sum up, we’ll be sticking with panels with a pixel pitch of between 1.9mm and 2.8mm on set, but this was an interesting little Friday project to see what we could cook up with what we had to hand. I think we may knock out a few more of these over the coming weeks; it was great to actually shoot something in our studio. Be sure to subscribe to our YouTube channel for updates.

KIT

Camera / RED Weapon 8K
Lens / RED Pro Zoom 17-50 T2.9
Lighting / ARRI Orbiter

GODBOX Pro
LED / Infiled ER3.9
LED Processor / Brompton S4

Crew

Director / DOP Asa Bailey
Technical Director / Actor Ruben Bailey
System Engineer Jordan Gough

Pipeline

Platform / GODBOX™ Pro
Software / UE4 / nDisplay
Edit and Grade / Premiere Pro 2021

Colour Grading in Unreal Engine


Real-time colour correction lets on-set colourists working in Unreal Engine give Directors and Directors of Photography a live visualisation of what a grade will look like. Because the grade can be set on set and re-adjusted after the shoot simply by reopening the Unreal project, on-set colour correction is making its way into real-time virtual visualisation pipelines.
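As a minimal sketch of what that re-adjustment can look like, assuming the grade is driven by a PostProcessVolume and the Editor Python plugin is enabled (property names follow the UE4 Python API):

```python
import unreal

# Sketch: nudge the grade on the first PostProcessVolume in the level.
for actor in unreal.EditorLevelLibrary.get_all_level_actors():
    if isinstance(actor, unreal.PostProcessVolume):
        settings = actor.get_editor_property("settings")
        settings.set_editor_property("override_color_saturation", True)
        # Vector4 per-channel saturation; the fourth component scales all channels.
        settings.set_editor_property("color_saturation", unreal.Vector4(1.0, 1.0, 1.0, 0.9))
        actor.set_editor_property("settings", settings)
        break
```

Because the grade lives in the project rather than in baked footage, reopening the level and re-running a script like this (or just editing the volume by hand) is all a later re-adjustment takes.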

Optimising 3D assets For Real-time Rendering in Unreal Engine

If you are looking to create visual effects such as live compositing, or to benefit from real-time animation and filmmaking, you’ll more than likely want to create sets and assets in a game engine like Unreal Engine. Here’s a quick rundown of the things you can do to optimise your graphic assets and virtual sets.

UPDATE: time and time again we receive UE levels for on-set virtual production that are under-optimised, so we thought we’d keep this post updated with hints and tips on how to optimise UE levels for on-set rendering and production.

See what FPS you are running at

First thing to do: show the FPS stats as you design your virtual set, so you can fly around your level and watch the FPS change in real time. This is vital, as you’ll want to keep your level running at 120fps to leave plenty of headroom for inserting video and mask packets into your setup. Even if you are shooting at 24fps or 30fps, the extra frame rate is needed to packet in your mixed-media video information, especially if you want to reduce any lag in your system.
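The stat fps console command is the quickest way to get this. If you’d rather toggle it from a script, here’s a minimal sketch using the UE4 Editor Python API (assuming the Python Editor Script Plugin is enabled):

```python
import unreal

# Toggle the on-screen stats from a script instead of typing into the console.
# "stat fps" / "stat unit" are standard UE console commands; "t.MaxFPS 0"
# uncaps the frame rate so you can see your true headroom.
world = unreal.EditorLevelLibrary.get_editor_world()
unreal.SystemLibrary.execute_console_command(world, "stat fps")
unreal.SystemLibrary.execute_console_command(world, "stat unit")
unreal.SystemLibrary.execute_console_command(world, "t.MaxFPS 0")
```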

Before Optimisation

Here’s an example of a badly optimised Unreal Engine set; look at the red FPS stats displayed on the scene.

After Optimisation

Here’s the level running at 120fps, correct for virtual production. We found that the problem was an actor that was not optimised correctly and was slowing the level right down. We deleted it, and boom, back up to the minimum required FPS for virtual production.
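To hunt down a heavy actor like that, a rough triage script can help. This is a sketch only, assuming static meshes are the culprit and that your engine version exposes StaticMesh.get_num_triangles to Python; skeletal meshes, particles and lights need their own checks:

```python
import unreal

# Rough triage: list the ten heaviest static meshes in the level by LOD0 triangles.
counts = []
for actor in unreal.EditorLevelLibrary.get_all_level_actors():
    for comp in actor.get_components_by_class(unreal.StaticMeshComponent):
        mesh = comp.get_editor_property("static_mesh")
        if mesh:
            counts.append((mesh.get_num_triangles(0), actor.get_actor_label(), mesh.get_name()))

for tris, label, mesh_name in sorted(counts, reverse=True)[:10]:
    unreal.log("{:>10} tris  {}  ({})".format(tris, label, mesh_name))
```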

Take a look at Simplygon

Following up on what Epic suggest in their documentation, Simplygon is used in the games industry for optimising 3D meshes and reducing polygon counts. It can also re-mesh, and from the looks of things it does a tidy job of smoothing irregular vertices.

Follow what Epic Games suggest: we advise anyone looking to get into real-time VFX to brush up on how game developers optimise their assets and levels.

Unreal Engine Helps Asa Bailey Deliver Real-Time Virtual Production

Asa Bailey took the role of Director and DVP (Director of Virtual Production) on the short film series created for NTT Data. Each short film took just 4 days to produce; Asa was able to deliver such rapid productions with a newly pioneered Unreal Engine-based real-time workflow. In this article we’ll walk you through this real-time-ready workflow and how virtual production took the series to new heights.

The series was created for a company called NTT Data, one of the largest IT services providers in the world, with more than 120,000 employees globally. It was created to show the future direction of NTT Data in each industry sector, with industries spanning automotive to healthcare. The series focused on individuals being sent to the future to be awakened to the future of their company. The clients wanted a very sci-fi aesthetic, and we achieved this through fully virtual sets throughout the production.

Asa Bailey Director and Director of Virtual Production on set.

Pre-Production

Pre-production on a real-time project like this was crucial: because virtual production methods cut post-production so heavily, we have to compensate in pre-production. There was a continuous back and forth between Asa and the 3D team, deciding animations, sets and colour schemes. The short films harnessed virtual and physical objects, meaning there was still a strong need for the art department. In our workflow we keep anything touched or held by the talent a physical object, creating an augmented production between the real and virtual worlds.

Virtual Production in Unreal Engine

On-set Production

On set, Unreal Engine really came into its own, supporting the production with real-time composites and powering our virtual production system, which combined several pieces of software and hardware including Unreal Engine, Brainstorm’s InfinitySet and chroma keying hardware.

The virtual production system allowed us to previsualise the entire production in real time, showing the talent composited within the virtual sets. This was displayed all over the studio, giving the crew and actors an idea of the finished product. The system also records four separate layers: the talent on black, the optical layer, the chroma key and the final composite of talent and virtual set. This gives the production total freedom: you can go down the traditional VFX route with just the optical layer and a pre-done chroma key, or do what we did and take 90% of shots straight out of the system and into editing, with only little touch-ups in post.

Another element of the real-time on-set production is the actual camera and optical layers. It’s essential that the virtual and real-world lighting match: if there’s a flashing lamppost in the background of the virtual set, that needs to be reflected on the real talent.

Virtual Production Unreal Engine

Next Steps for Virtual Production

New virtual production methods and technologies are constantly surfacing, and with the rapid growth of Unreal Engine by Epic Games, virtual production is constantly expanding and providing new opportunities.

The integration of real and digital characters is a massive leap for virtual production. We are now able to have virtual characters and physical actors in the same environment at the same time; whether the environment is a virtual or a physical set, the digital character can be inserted into either. We can now do this through new real-time facial and motion capture.

As seen in the video below, Mo-Sys Engineering are bringing this vision into the present through their Unreal Engine plugin and 3D tracking. Here you can see how Mo-Sys bring together tracking, Unreal Engine and motion capture to present an almost reversed virtual production: the only virtual asset is the character, while everything else remains real, from the studio to the presenter. The performance was all done in real time.

Real-Time Lighting and Reflections

Lighting within virtual production can be a massive challenge, but we now have the ability to adjust virtual and physical lights in real time to get the perfect lighting on both the real and virtual sets. This allows total lighting symmetry with the virtual world, creating a more believable finished product.
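As an illustration of the virtual half of that adjustment, here’s a hedged sketch of nudging one UE light from the Editor Python console so it tracks a physical fixture; the actor label and values are hypothetical:

```python
import unreal

# Sketch: match one virtual light to the level set on a physical fixture.
# "Corridor_Neon_01" is a hypothetical actor label used for illustration.
for actor in unreal.EditorLevelLibrary.get_all_level_actors():
    if actor.get_actor_label() == "Corridor_Neon_01":
        light = actor.get_component_by_class(unreal.LightComponent)
        light.set_intensity(8.0)  # mirror the dimmer level on the real fixture
        light.set_light_color(unreal.LinearColor(1.0, 0.25, 0.6))  # neon pink
        break
```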

LED wall development is moving rapidly, with pitches shrinking from 2.5mm to 0.7mm in less than 5 years. With a 0.7mm pitch it’s possible to shoot LED screens as backgrounds and provide a totally real-time in-camera VFX option. On normal green screen shoots, green spill is an issue for every CG artist, but with LED walls the spill from the screens is actually the perfect lighting for the scene.

Unreal Engine 4.23 Supports Cinema 4D

The latest version of Unreal Engine (4.23) adds support for the leading motion graphics and animation application, Cinema 4D. The new support comes via a Datasmith plugin, a feature of the free Unreal Studio beta.

Cinema 4D import to Unreal Engine using Datasmith

With this new level of integration, creators can natively import .c4d files directly into Unreal Engine, with support for scene hierarchies, geometry, materials, lights, cameras and baked animations. Cinema 4D’s ‘Save for Cineware’ command allows users to easily bake complex procedural motion graphics directly into real-time scenes through the Unreal Engine Sequencer cinematic editor.
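In practice the import can be scripted too. This is a sketch based on the documented Datasmith Python workflow, assuming the C4D importer is enabled; the file path and destination folder are placeholders:

```python
import unreal

# Sketch of the documented Datasmith Python workflow applied to a .c4d file.
# Assumes the Datasmith C4D importer (Unreal Studio beta) is enabled; the
# file path and destination content folder below are placeholders.
scene = unreal.DatasmithSceneElement.construct_datasmith_scene_from_file("D:/mograph/title_sequence.c4d")
if scene is None:
    unreal.log_error("Datasmith could not open the file")
else:
    scene.import_scene("/Game/Cinema4D")  # imported assets land under this path
```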

Cinema 4D support is massive for Unreal Engine, allowing us to bring assets from Cinema 4D directly into Unreal and iterate quickly to accommodate changes without waiting on render times. Companies like NFL Media currently use pre-rendered motion graphics on live TV, which limits how much they can change on the spot, but with Unreal Engine you’re able to edit live, there and then. This will push broadcasters towards Unreal Engine to cut out long renders for a faster turnaround.

On-set lighting and reflections for Virtual Production in Unreal Engine.

In this post, Director of Virtual Production Asa Bailey gives his views on, and a comparison between, various methods for creating real-time reflections on set, and explains why he favours the methods that still give him options in post.

Asa Bailey, Director of Virtual Production, On-Set Facilities.

At this year’s NAB there was a bit of a battle brewing, with some folks claiming that their new “final pixel” virtual screen walls would make green screen shooting, and in fact post-production, a thing of the past. I don’t think so. I have used both on set in virtual productions, and I’d say they are both valid options for shooting; it’s about using the right tool for the shot.

In my view, the big problem with virtual wall technologies is that the footage is “baked” in camera and sold as “final pixel”, just as the guys say in this video, and this leaves far fewer, if any, options for layered work (post VFX passes).

But to their credit, virtual walls do give amazing reflections on faces, eyes and highly reflective surfaces (glass, paint, mirrors etc.), as shown in this demo from UE and Stargate Studios, who have worked on some big projects, so they know a thing or two about VFX.

They sing the praises of “final pixel” in camera as king for both creative and financial reasons. But come on guys, you know as well as I do that in the real world of a studio pipeline, with studio Producers, Agencies, Talent and god knows who else wanting input into the final product, losing the option to post-process shots is too big a risk for Producers.

So would I shoot against a virtual wall? If the scene needed it and it would give the best result, yes. Would I use one to generate a realistic reflection on someone’s face who was, say, looking out of a window? Yes, I totally would, and do. But on my sets it would be out of shot, so I could really use any sort of screen; it would not necessarily have to be connected to my virtual production rig and offer real-time perspective, which is a nice bonus but not essential.

Lighting for virtual production

Matching the lighting on-set to the virtual set in Unreal Engine.

Whether as a Virtual Production Supervisor or as a Director, it’s my job to make sure that we leave the set with options, and that’s why I shoot real-time in layers, with a composite as one of them. 80% of my shots are ready to grade and edit with minimal clean-up in post. The other 20% need critical changes, often driven by other stakeholders (Agencies, Producers etc.).

Shooting in layers.

Shooting in layers protects the Producer’s investment while cutting costs at the same time.

We shoot the talent layer (what the optical camera sees) and record the UE4-generated backgrounds as plates, along with other real-time VFX layers, on separate data layers, so that we can open them up later for VFX passes as required, if required. We have the choice: we can go to grade and edit with the composite files, recorded 10-bit 4:2:2 with audio and ready to drop onto any edit timeline, or we can open the layers up and add more VFX passes. We can even regenerate UE backgrounds using our tracking (FBX) data: if we decide we don’t want that tree in the background anymore, that’s OK, we simply remove it, run the background layer again with virtual cameras in UE, export the shot and drop it back into our layer stack.

The Virtual Set

A neon-lit tunnel in Unreal Engine that the cast will appear to walk down.

Real-time reflections are a challenge; in fact, lighting in virtual production is a challenge full stop, and you have to have done it many times to know what works. As they say, practice makes perfect. My Gaffers and DOPs have been lighting green screen sets for years, and they know where to put the light and how to spill just the right amount onto a subject (optical) to make it match in the final real-time composite. But the point is, if we need to, we can open up our recorded layers (optical, background, matte, foreground, composite) in post and fix any lighting issues.

You may think this defeats the object of real-time production, but realistically, real-time is a turbo boost to any production; it is not a goose that lays golden eggs. Nine times out of ten you’ll still want post-production options, especially if you are shooting for high-end streaming shows, feature films or big brand commercials. For me, I want to shoot with real-time virtual production methods, but I also want the option to take advantage of the 100-year-old VFX industry and all the amazing talent it holds.

Methods For Creating On-Set Lighting for Virtual Productions

Back to creating reflections in real time: I prefer real lights rigged up on set to mimic the light that would be there in the virtual set. As you can see in the images below, we had an actor walk down a neon-lit corridor (the UE background); on set we rigged a number of lights to mimic the virtual set lights, so that our optical layer inherited the scene’s lighting. Having physical lighting on set that matches, as closely as possible, the light you’d get on the virtual stage is important; in post it really helps to have something close to work with.

Virtual Production System

On-set virtual production system and studio lighting.

If the reflections need to move, as in a driving scene, I use three methods.

1 – Moving practical lights

We rig up stage lighting, with gels and colour temperatures set to match the virtual world, fixed to some sort of mechanical rig that will move the lights. Or we set up moving flags in front of the lights to mimic passing buildings etc.

2 – Screens

We’ve used multiple large LCD walls to generate reflective images. Usually positioned out of shot (so we can still shoot for chroma), the screens cast a large wash of real-time light onto the scene; we’ll run the background on the screen from within UE, or as a simple video file playing while we do the take.

3 – Projectors

We also use projectors showing moving backgrounds; in fact this is my favourite way to cast real-time moving reflections onto my cast. The light from a projection on its own is often very hard, but you can use scrims and diffusion to soften it, and this gives you another creative tool to tweak until it’s just right for your look.

I hope that helps. As I say, practice makes perfect and there is no one right way; it’s about the shot, the vision, the methods and obviously your budget. But any of the above approaches can be done on any budget; it’s just a matter of scale and complexity.

See the shot, craft the light, shoot.