What if you’re a couple of guys in a lab on a Friday, and all you have is a 2m x 4m LED wall with a 4mm (3.9mm) pixel pitch, on loan from your mates at 80Six? Looking at it, we wondered: what could we get away with shooting on it?
That was this week’s OSF Friday Lab challenge. Usually, this particular panel is only used in our development studio for system testing, not for shooting. But what does an almost 4mm pixel pitch screen actually look like up close and personal, through a real cinema camera? It’s not bad, and it’s not great either, but if you take a look, it may be good enough for what you want to shoot.
As expected, trying not to see the LED pixels in camera at this huge pitch size is crazy hard. Forget focussing anywhere within 4m of your LED; it’s not going to happen, and you’ll see pixels in your final image. Instead, we had to position the talent at least 4m away from the screen, supporting the basic rule of thumb of keeping your talent roughly 1m from the LED for every 1mm of pixel pitch.
We then went wide open on the iris of our old RED zoom (2.9) and hit max on the LED processor’s nits, to get as much light as possible out of the widely spaced LED pixels, but then we had to ND filter up to counter the maxed-out LED. Even then we struggled; a 2.9 could really do with being a 1.9.
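For anyone wondering how much the slower glass cost us, the difference between f/2.9 and f/1.9 works out like this (a quick sketch of the standard f-stop arithmetic, nothing more):

```python
import math

def stop_difference(f_slow: float, f_fast: float) -> float:
    """Exposure difference in stops between two f-numbers.

    Light gathered scales with the inverse square of the f-number,
    so each doubling of light-gathering area is one stop.
    """
    return 2 * math.log2(f_slow / f_fast)

# Opening up from f/2.9 to f/1.9 would buy roughly 1.2 stops --
# the extra light we were missing on the day.
print(round(stop_difference(2.9, 1.9), 2))  # ~1.22
```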
Also, if you ever need to use such a big pixel pitch for real, you are going to need a big shooting stage. What you see here was a total battle with focus points, and with the amount of stage area such a high pixel pitch LED demands (a potential 8m of wall clearance if shooting reverse or in the round).
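The rule of thumb above is easy to sketch out; the numbers below use the figures from this shoot (a rough guide for planning, not an optical law):

```python
def min_talent_distance_m(pitch_mm: float) -> float:
    """Rule of thumb: keep talent at least 1 m from the LED
    for every 1 mm of pixel pitch."""
    return pitch_mm * 1.0

def wall_clearance_m(pitch_mm: float) -> float:
    """Shooting reverse or in the round needs that clearance
    on both sides of the action."""
    return 2 * min_talent_distance_m(pitch_mm)

print(min_talent_distance_m(3.9))  # 3.9 m -- matching the ~4 m we used
print(wall_clearance_m(3.9))       # 7.8 m -- close to the 8 m quoted
```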
This means you need a lot more space when you’re shooting with higher pitch LED panels, as you need room to keep your depth of field from bringing the pixels into sharp focus. All in all, a good day’s shooting, and we are happy with the results given the minimal crew: no post, all shot and edited in a day. It still amazes us how efficient virtual production is at filming content at speed.
To sum up, we’ll be sticking with panels with a pixel pitch of between 1.9mm and 2.8mm on-set, but this was an interesting little Friday project to see what we could cook up with what we had to hand. We may knock a few more of these out over the coming weeks; it was great to actually shoot something in our studio. Be sure to subscribe to our YouTube channel for updates.
These Unity plug-ins are useful for anyone looking to build their own VP pipelines using the Unity 3D engine as their development platform.
As the main alternative to Unreal Engine, Unity has some very good things going for it. Its early VP-focused build has been adopted in many studios, and the Unity development community has been busy too. Here are a number of third-party Unity plug-ins for virtual production that are certainly worth looking at.
Here are some of the plug-ins we use in our Unity virtual production pipeline developments.
Make your LED look better with CTAA V3 Cinematic Temporal Anti-Aliasing.
With a single click, CTAA achieves true next-gen cinematic render quality in real time. Offering an inherently filmic quality, CTAA enhances game graphics without the performance compromises and artifacts found in other solutions. No unnecessary and detrimental post-sharpening filter is required: CTAA maintains clarity and crispness both when stationary and in motion.
Set up your camera frustum with Cinema Pro Cams – Film Lens & 3D Toolkit.
A professional set of cinema lenses that have had their lens intrinsics virtualised and turned into a set of virtual lenses. This plugin adds cinematic lens data to aid VP developers in creating virtual cameras.
Cinema Pro Cams is a Unity plug-in that adds a toolbox to aid in the creation of accurate, real-world cinematic/film cameras inside your Unity project (Unity 5, 2017, 2018 and above). It’s the perfect tool for game developers or film professionals who want their projects to have the cinematic look and feel of real-world film lenses, helping to further immerse the player or viewer in your story.
Point Full-Screen display output to any screen using Editor Window Fullscreen
Pop your editor window onto any device, including your phone, tablet or your LED processor, without leaving the editor. Boost your productivity by full-screening your favourite editor windows on your secondary screen at the touch of a button.
The OSF GODBOX ranked as the 18th fastest computer in its class (out of 500,000 benchmarked builds), powering real-time VFX and LED stage virtual productions. The newly released GODBOX Pro MKII is the first OSF computer to ship to studios with the new proprietary OSFX architecture, OSF’s answer to optimised on-set computing with direct-to-display technology.
Power each 4K LED Canvas with GODBOX™ Pro MKII
With direct support for Unreal Engine, Unity 3D and Omniverse, the new build also marks On-Set Facilities’ move away from Intel’s i9 to AMD Threadripper, along with an upgrade from the NVIDIA Quadro RTX 8000 to the latest NVIDIA A6000 GPU. Full system spec.
Before we go any further we would like to thank the teams at iMAG Displays and Treehouse Digital for inviting us to come down to their LED stages to test the power of the new GODBOX™ in action. Also a huge thanks to Brompton, RED Digital Cinema and Ncam for supporting these tests. May we also just clarify our ranking methods: we use multiple rendering benchmarks, ranking the GODBOX™ against a worldwide sample of hundreds of thousands of computer builds. Benchmarking is vital, but at OSF we believe the only real way to test our new products is on-set.
Testing the GODBOX Pro MKII On-Set
It was a fantastic opportunity to show the full power of our latest GODBOX Pro on the big stage. On the day, the OSF virtual production computer powered both the camera tracking and the 3D UE4 virtual environments simultaneously, to 2 x 4K LED canvases, driving low-latency, HDR real-time rendered graphics to a 32m x 3m, 2.8mm pitch volume, powered by a total of just 2 synchronized GODBOX Pro computers.
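As a sanity check on those numbers, a quick back-of-the-envelope sketch (assuming UHD 4K outputs of 3840 x 2160, which the figures above don’t state explicitly) shows why two 4K canvases comfortably cover a wall that size:

```python
def wall_pixels(width_mm: float, height_mm: float, pitch_mm: float):
    """LED pixel counts along each dimension of a wall:
    physical size divided by the pixel pitch."""
    return (round(width_mm / pitch_mm), round(height_mm / pitch_mm))

w, h = wall_pixels(32_000, 3_000, 2.8)
print(w, h)              # roughly 11429 x 1071 LED pixels
print(w * h)             # ~12.2 million pixels on the wall
print(2 * 3840 * 2160)   # ~16.6 million pixels from two UHD 4K outputs
```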
Proving our latest GODBOX Pro build on-set, capable of pixel-perfect mapping and delivering low-latency 4K outputs at 120fps to each 4K LED canvas, we benchmarked the system and found that at 4K we still had a whopping 80% of computing headroom left for artists to utilise in-engine for real-time sets and VFX.
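Taking the quoted figures at face value, the frame-time arithmetic behind that headroom claim looks like this (a rough sketch, not a profiled measurement):

```python
def frame_budget_ms(fps: float) -> float:
    """Time available to render one frame at a given frame rate."""
    return 1000.0 / fps

budget = frame_budget_ms(120)
print(round(budget, 2))         # 8.33 ms per frame at 120 fps
# 80% headroom implies only ~20% of each frame's budget was used:
print(round(budget * 0.20, 2))  # ~1.67 ms of work per frame
```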
Director of Virtual Production Asa Bailey @ On-Set Facilities 2021
In the first GODBOX™ the i9 was chosen for its stability at lower core counts, but with the latest advances in the OSFX architecture, OSF reports production-ready stability has now been reached on all 32 cores and 64 threads, powering 8K UHD HDR output from all major engines.
Driving the iMag’s Black Pearl and Black Diamond ROE LED panel walls, the OSF-engineered virtual production system delivered blisteringly fast and reliable 10-bit, HDR, ACES-pipeline, real-time raytracing support. With both inner and outer camera frustums running at full resolution, with no loss of frames or reliability, it delivered a constant synchronized 24FPS in editor. Using the OSF GODBOX on-set computing platform, with native Unreal Engine or Unity 3D support and no middleware, the results speak for themselves.
What You Need to Know about the new GODBOX™ Pro MKII
GODBOX Direct LED Processor Support
Connecting the GODBOX Pro 4K outputs directly to the iMag studios Brompton SX40 LED processors, without the need for any third-party media servers or software, the simplicity of the GODBOX system not only reduced overall system latency, it also let users take advantage of the Brompton SX40 LED processors directly to adjust the brightness, colour and temperature of the LED, giving users the ability to match the LED to any camera sensor.
Coincidentally, as the RED brand name suggests, on an LED stage we have found that RED cameras do indeed see a little more red on the RGB scale. That’s compared to the ARRI camera systems, which seem to see more of the blue/green in the RGB curves. With GODBOX™ directly driving 4K HDR content to the Brompton SX40 LED processors, we were able to instantly adjust colour levels of the LED and pull any RGB colour out of the sensor’s vision with ease.
Go Directly from Your Engine to Your LED Processor
But no matter what camera system you are using, thanks to how the GODBOX virtual production computer connects your engine directly to the LED processors, just as if they were one massive 4K, 8K or even 16K TV or monitor, GODBOX users can reduce their system lag and deliver graphics to the LED processors at up to 120FPS as standard.
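A simplified way to see why removing a middleware hop matters: if we assume every device in the signal chain buffers at least one full frame (a common rule of thumb; the stage counts below are illustrative, not measured), the latency difference is stark:

```python
def chain_latency_ms(fps: float, stages: int) -> float:
    """Rough model: every device in the signal chain (engine,
    media server, LED processor...) buffers at least one frame."""
    return stages * 1000.0 / fps

# Hypothetical 24 fps chain with a media server in the middle:
print(chain_latency_ms(24, 3))   # 125.0 ms
# Dropping the media server and running 120 fps direct to the processor:
print(chain_latency_ms(120, 2))  # ~16.7 ms
```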
Book a GODBOX Demo at Your Studio
OSF is proud of what we were able to accomplish considering our whole crew was under ten people and we had only 1 day to install and tune the OSF GODBOX system on the iMag LED stage. If you’d like to take a look at the GODBOX in action in your studio, drop us a line at [email protected]
On-Set Facilities CEO and VP Cinematographer Asa Bailey helms virtual productions both on-set and in the cloud, working with some of the industry’s top VP and on-set VFX crew, and since the pandemic hit, he’s never been busier. In this post we sit down with Bailey to try and learn more about the role of the Virtual Production Supervisor.
The Virtual Production Supervisor Role
The Virtual Production Supervisor advises the producers and implements agreed virtual production solutions for the benefit of the production, helping to minimize the production spend and schedule without compromising the vision or quality of the outcome. As head of the VP department, they have considerable influence over the whole production process, working in collaboration with the Director, DOP, Production Designer and VFX Supervisors.
The Virtual Production Supervisor is responsible for the reliability and functionality of the virtual production systems and workflows, and management of the virtual production both on-set and in the cloud. The grounding skills of the VP Supervisor are more tuned towards computer networking, camera tracking systems, LED and real-time lighting, on-set VFX visualisation, motion capture, virtual cinematography, game engine operation, real-time rendering, broadcast engineering, cloud computing and software development.
NVIDIA Omniverse Multi-user
A Virtual Production Supervisor understands the creative possibilities and technical difficulties in the processes of virtual production and is expected to have an expert understanding of the best ways to use the technology practically and creatively during every stage of production.
A Virtual Production Supervisor understands the production conflicts between budget, creative direction & schedule and can implement virtual production solutions that help to mitigate these conflicts. It is the Virtual Production Supervisor who will represent virtual production in all departments, assisting others to embrace virtual production technologies, workflows and services into their way of working.
During development and preproduction, the Virtual Production Supervisor is expected to offer proposals and suggestions toward the overall creative direction of a project, including suggesting new virtual production methodologies, processes and ways of working, proposing technically-minded ideas that may impart a creative benefit to the whole production.
During preproduction, the Virtual Production Supervisor is responsible for managing and supporting any multi-user connectivity to the collaborative cloud workflows, including working with the VFX Supervisor to prepare assets for virtual scouting and providing technical support during virtual scouting sessions.
During production, the Virtual Production Supervisor is responsible for maintaining a stable real-time virtual production environment on-set and in the cloud, including the reliable recording of all digital VP data, such as engine takes, FBX, FIZ, motion capture, and pre-vis masks & composites, as well as ensuring the integrity of data transfer to the VFX team and on-set digital information technicians.
During post-production, the Virtual Production Supervisor is responsible for all recorded virtual production data and for coordinating and running any post-viz sessions using multi-user virtual environments, as required for post-production.
More Virtual Production Crew Roles
The Virtual Production Technical Director reports to the Virtual Production Supervisor or VFX supervisor and is responsible for the functionality of the virtual production systems and workflows, including the communication and management of the virtual production systems team.
During development and preproduction, the Virtual Production Technical Director is responsible for supervising multi-user virtual scouting sessions and for implementing and supporting the digital pre-visualization working environment.
The Virtual Production Technical Director is responsible for any multi-user connectivity to the collaborative cloud workflows, including working with the VFX and 3D content development teams to assist in preparing assets for virtual scouting. The VPTD is also responsible for ensuring that there is sufficient technical support to assist any crew members who may need it during on-set and cloud virtual production sessions.
During production, the Virtual Production Technical Director is responsible for maintaining a stable real-time virtual production environment on-set and in the cloud.
As well as assisting the Virtual Production Supervisor to invent and develop VP solutions, it is the responsibility of the Virtual Production Technical Director to provide a reliable VP system for the recording of all digital virtual production data, such as engine takes, FBX, FIZ, motion capture, and pre-vis masks & composites, as well as ensuring the integrity of data transfer off-set for any necessary post-production.
During post-production, the Virtual Production Technical Director is responsible for delivering all recorded virtual production data and for coordinating and running any post-viz sessions using multi-user virtual environments as required by all departments during post-production.
Virtual Production Developer is responsible for developing new virtual production software and hardware and its integration into virtual production systems.
Virtual Production Technician is a specialist operator of virtual production hardware and software and its integration into virtual production systems.
Virtual Production Camera Assistant is responsible for the virtual production hardware at the camera end of any virtual production system.
System Networking Engineer is responsible for specifying and maintaining all network connectivity between virtual production hardware and software.
Virtual Production Producer is a producer who understands the tech and what it takes in time and money to execute. Responsible for budgeting VP systems for productions, studio installations, events and wherever VP crew and technology come together on-set and in the cloud.
Virtual Production Director is a Creative Director who specializes in directing both on-set and in the cloud. Hollywood directors who could be classed as Virtual Production Directors include James Cameron, Jon Favreau and Robert Zemeckis.
Virtual production is digital production 2.0
In the early 1990s, production computing happened, and it happened locally (digital cameras, desktop publishing, rendering). With virtual production, these local processes now connect to the cloud, bringing all the benefits of cloud computing to every production that’s connected.
Where local and cloud computing come together is, in computing terms, called the “Edge”, and where the Edge exists it is disrupting not only the content production industry but all industries.
At the Edge, where humans, computing and high-speed networks combine, virtual worlds (services and applications) can be built. These virtual worlds are often collectively referred to as the multiverse, metaverse or omniverse.
Virtual Production Technology can be quantified by its ability to aid and enable the action of virtual production. Virtual production technologies when combined create virtual production systems. These systems are designed to provide specific virtual production functions, workflows and services.
Asa Bailey CEO, On-Set Facilities
Virtual Production Director / Supervisor
Setting up Omniverse Unreal Engine Plug-in (Connector)
We take a first look at the NVIDIA Omniverse rendering platform and how to connect it to Unreal Engine. To start, you’ll need to find and install the Omniverse launcher, Omniverse Create and the Unreal Engine connector. These are located in the “Exchange” tab, underneath Apps and Connectors, in the NVIDIA Omniverse launcher application that you download from here.
Once all the software is set up (we run these tests on the OSF Godbox UE workstation with an NVIDIA GPU) and you have your accounts made, decide which machines are going to run what software. We suggest running each application on its own machine: the Create app on one and UE on the other.
On the Omniverse machine, we’re going to create a local host server by going to the “Collaboration” tab and selecting the “Local Nucleus Collaboration Service” settings option. This leads to a web interface which allows us to control our local host and build new connections.
In the connections section of the web interface, create a new connection called “LocalHost”. Then navigate to the same place on the Unreal Engine machine and select “Add”; this time, when it asks for a server name, enter the IP address of the Omniverse machine and sign in with the same details.
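If the Unreal machine can’t see the server, it helps to rule out basic networking before hunting through the launcher UI. A minimal sketch; the example IP and port below are assumptions, so check which ports your local Nucleus service actually exposes:

```python
import socket

def is_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical example -- substitute your Omniverse machine's IP and
# the port your Nucleus install actually listens on:
# is_reachable("192.168.1.50", 3009)
```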
Next, launch Omniverse Create and your Unreal Engine project on their respective machines. If the connector is correctly installed, there should be an Omniverse icon on the toolbar within UE. Under the icon is a drop-down with the option to “Add Server”; in this box, put the IP address of the Omniverse machine. This should prompt an engine notification stating either connected or failed, and it also creates an “Omniverse” folder in the content browser with your Omniverse server’s IP and file paths.
Once both servers are connected, right-click the assets or map you want to collaborate on and select “Export to Omniverse”. Then select the Omniverse server’s IP in the Omniverse folder and choose where you want the selected assets to appear on both machines.
Open the imported map within Omniverse Create. Once the map is loaded and compiled, navigate into the Layer panel and select the grey cloud icon next to the root. This makes all assets within the map live and editable on both machines. Activate live updates on the Unreal machine by ticking the “Live Edit” option in the Omniverse drop-down.
We’re 98% of the way there, with a connection between both machines, live edits and shared assets. Now we just have to run the UE map as a USD. This file is created automatically when exporting to Omniverse and sits in the same place we exported the map to previously.
Virtual production technology company On-Set Facilities™ has integrated StageCloud™, the world’s first sub-£1m complete LED XR stage solution for filming in-camera VFX using tracked cameras and LED.
Aimed as a complete package for companies looking to own their own XR stage to shoot stills and video content, the StageCloud™ solution is a 7m x 4.4m LED cave with ROE Diamond 2.6mm pitch LED walls, an LED floor, Brompton processors and a complete computing platform running Unreal Engine on OSF Godbox™ computers.
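For a sense of the canvas that pitch gives you, a quick sketch (only the 7m width is used, since the wall height isn’t broken out in the spec above):

```python
def led_canvas_px(size_mm: float, pitch_mm: float) -> int:
    """Number of LED pixels along one dimension of a wall:
    physical size divided by the pixel pitch."""
    return round(size_mm / pitch_mm)

# 7 m wide wall at 2.6 mm pitch:
print(led_canvas_px(7_000, 2.6))  # ~2692 pixels across,
# so a single UHD-width (3840 px) output covers it with room to spare.
```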
OSF Founder and Virtual Production Supervisor Asa Bailey said: “LED prices are still too high for many companies, but as with all digital technologies, LED prices will come down, especially as LED manufacturers look to widen their markets. We predict this will bring LED and StageCloud™ technology to many more companies over the coming years.”
OSF sees its cutting-edge developments in virtual production as similar to developments in Formula E racing, where advances trickle down to consumer electric car production. Bailey adds: “In time, we hope that our stage technologies will revolutionise how we interact with the metaverse as we enter an age of immersive experiences. StageCloud™ will bring immersive computing to every room.”
But for now, using StageCloud™, companies can invest in their own XR studio to shoot on 3D-rendered virtual sets: any location dynamically displayed, matching any camera move, bringing real-time in-camera VFX to entertainment and marketing content alike.
As real-time rendering engines such as Unreal Engine become even more capable of rendering photorealistic graphics, more and more content producers will capture real-world environments and spin them up on an XR stage, bringing any outside location indoors. By extending the LED stage’s physical area using virtual set extensions generated in a game engine, potentially massive 3D virtual sets can be created.
Support for StageCloud™ is provided by OSF’s own worldwide network of engineers, backed by additional crews from component manufacturers and VP crews. Orders for the new StageCloud™ XR stage package are already coming in, with the solution proving popular with companies in gaming, fashion and interiors. The first installations of StageCloud™ are set to roll out in the first quarter of 2021. Anyone interested in owning an XR stage can email the StageCloud™ design and sales team at [email protected]