With the demand for LED studios blowing up, this week OSF Founder and VP Cinematographer, Asa Bailey, points us towards some useful alternatives to the ROE BP2 LED panel.
5 More LED Panels You Should Look At
“All of these panels have the desired pixel pitch, brightness, blackness, and scan-drive options that work for film and TV cameras, and they are also all good for hitting the REC2020 colour space,” says Bailey, DVP and CEO of On-Set Facilities. “All these panels are also what you could call open, in that they support a wide choice of processing platforms, meaning you can choose your own processing and take advantage of mixing multiple platforms between multiple productions, based on their availability in the market.”
Some of the panels below get significantly closer to Rec 2020. Where previously very few panels on the market had a colour gamut approaching the Rec 2020 spec, there are an increasing number today, manufactured with LEDs that can touch the deeper colours, especially when driven by smart dynamic calibration so that the colour is not restricted by the default factory calibration.
These five panels are good examples of LED panels for in-camera VFX, in no particular order:
The OSF GODBOX ranked as the 18th fastest computer in its class (out of 500,000 benchmarked builds), powering real-time VFX and LED-stage virtual productions. The newly released GODBOX Pro MKII is the first OSF computer to ship to studios with the new proprietary OSFX architecture, OSF’s answer to optimised on-set computing with direct-to-display technology.
Power each 4K LED Canvas with GODBOX™ Pro MKII
It ships with direct support for Unreal Engine, Unity 3D and Omniverse. The new build also marks On-Set Facilities’ move away from Intel’s i9 to AMD Threadripper, along with an upgrade from the NVIDIA Quadro RTX 8000 to the latest NVIDIA A6000 GPU. Full system spec.
Before we go any further we would like to thank the teams at iMAG Displays and Treehouse Digital for inviting us to come down to their LED stages to test the power of the new GODBOX™ in action. Also a huge thanks to Brompton, RED Digital Cinema and Ncam for supporting these tests. May we also just clarify our ranking methods: we use multiple rendering benchmark tests, ranking the GODBOX™ against a worldwide sample of hundreds of thousands of computer builds. Benchmarking is vital, but at OSF we believe the only real way to test our new products is on-set.
Testing the GODBOX Pro MKII On-Set
It was a fantastic opportunity to show the full power of our latest GODBOX Pro on the big stage. On the day, the OSF virtual production computer powered both the camera tracking and 3D UE4 virtual environments simultaneously to 2 x 4K LED canvases, driving low-latency, HDR real-time rendered graphics to a 32m x 3m, 2.8mm pitch volume, powered by a total of just 2 synchronized GODBOX Pro computers.
Proving the latest GODBOX Pro build on-set, the system delivered pixel-perfect mapping with low-latency 4K output at 120fps to each 4K LED canvas. When we benchmarked the system at 4K, we still had a whopping 80% of computing headroom left for artists to utilise in-engine for real-time sets and VFX.
Director of Virtual Production Asa Bailey @ On-Set Facilities 2021
In the first GODBOX™, the i9 was chosen for its stability at lower core counts, but with the latest advances in the OSFX architecture, OSF reports that production-ready stability has now been reached on all 32 cores and 64 threads, powering 8K UHD HDR output from all major engines.
Driving iMag’s Black Pearl and Black Diamond ROE LED panel walls, the OSF-engineered virtual production system delivered blisteringly fast and reliable 10-bit, HDR, ACES-pipeline, real-time raytracing support, with both inner and outer camera frustums running at full resolution, with no loss of frames or reliability, delivering a constant synchronized 24FPS in editor. Using the OSF GODBOX on-set computing platform, with native Unreal Engine or Unity 3D support and no middleware, the results speak for themselves.
What You Need to Know about the new GODBOX™ Pro MKII
GODBOX Direct LED Processor Support
Connecting the GODBOX Pro 4K outputs directly to the iMag Studios Brompton SX40 LED processors, without the need for any third-party media servers or software, the simplicity of the GODBOX system not only reduced overall system latency, it also lets users tap the power of the Brompton SX40 LED processors directly to adjust the brightness, colour and temperature of the LED, giving them the ability to match the LED to any camera sensor.
Coincidentally, as the RED brand name suggests, on an LED stage we have found that RED cameras do indeed see a little more red on the RGB scale, compared to ARRI camera systems, which seem to see more of the blue/green in the RGB curves. With GODBOX™ directly driving 4K HDR content to the Brompton SX40 LED processors, we were able to instantly adjust the colour levels of the LED wall and pull any RGB colour out of the sensor’s vision with ease.
Go Directly from Your Engine to Your LED Processor
But no matter what camera system you are using, the GODBOX virtual production computers connect your engine directly to the LED processors, just as if they were one massive 4K, 8K or even 16K TV or monitor. GODBOX users can reduce their system lag and deliver graphics to the LED processors at up to 120FPS as standard.
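To put that 120FPS figure in perspective, here is some quick back-of-the-envelope arithmetic (pure maths, not a measurement of the GODBOX itself; the 24fps camera rate is just a typical cinema example):

```python
# Frame-budget arithmetic for driving an LED wall at a high refresh rate.
refresh_fps = 120
camera_fps = 24  # a typical cinema capture rate, assumed for illustration

frame_budget_ms = 1000 / refresh_fps            # time available to deliver each frame
refreshes_per_capture = refresh_fps // camera_fps  # wall refreshes per camera frame

print(f"Frame budget at {refresh_fps}FPS: {frame_budget_ms:.2f} ms")
print(f"Wall refreshes per {camera_fps}fps camera frame: {refreshes_per_capture}")
```

In other words, at 120FPS the whole pipeline has roughly 8.3 milliseconds to deliver each frame, and every 24fps camera frame spans five wall refreshes.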
Book a GODBOX Demo at Your Studio
OSF is proud of what we were able to accomplish, considering our whole crew was under ten people and we had only one day to install and tune the OSF GODBOX system on the iMag LED stage. If you’d like to take a look at the GODBOX in action in your studio, drop us a line at [email protected]
The Creative Console by Monogram is a modular desktop device designed to improve productivity and creative workflows. It lets users work in a variety of applications with ease, whether that’s editing pictures in Photoshop or video in Premiere Pro: the Creative Console works with them all.
And now Monogram has entered virtual production with their latest Unreal Engine 4.26 plugin. The plugin is designed to reduce barriers between direction and crew on set, which it achieves by connecting the Monogram Creator software to Unreal Engine. The Creator software now allows modules to be mapped to control in-engine features and assets, whether that’s a virtual camera or a dynamic light fixture.
Lighting can now be controlled via Monogram Creator, allowing users to adjust intensity, pattern, and colour through orbiters and dials. The Creative Console also extends into the virtual camera department, with the ability to adjust camera settings through the simple spin of a dial: focus, zoom and aperture can all be adjusted through Monogram. Once ready to record, Monogram Creator can also be mapped via its Sequencer and Take Recorder integration, putting all the controls for shooting on one designated deck.
Installing a Quadro Sync II Card for LED Video Wall Synchronisation

It takes just 10 minutes and a steady hand to install a Quadro Sync II card. With this card installed in your PC you can connect multiple GPUs to run in sync and begin synchronising your LED video wall for virtual production. If you have a virtual production PC with an NVIDIA Quadro GPU installed, simply open up your machine, locate an available PCIe slot, carefully seat the card in the slot, connect the 6-pin PCIe or SATA power cable (you should have one already, but check that your rig has a spare), and then connect the Quadro Sync II cable from your GPU to the top of the Sync II card.

To install the driver, visit the NVIDIA driver hub, where you can download the latest drivers for all NVIDIA products. Navigate through the search function, select the Quadro Sync II series, and download the Quadro Sync drivers. After installing the drivers, move down to “Quadro advanced options”, where you’ll be able to download the Mosaic utility. Mosaic technology combines multiple displays or projectors into a single virtual display. Your Quadro Sync is now ready to go and, if everything is set up correctly, will appear as a device in the NVIDIA Control Panel. Next time we’ll look at how to connect a cluster of Quadro Sync connected machines to control the different LED video walls in your set-up.
Virtual production spans many kinds of hardware and software: whether you’re working with nDisplay powering graphics on LED screens, or virtual scouting in Unreal Engine, VP covers it all.
But at the heart of it all are the computers powering and enabling the whole process. The engineering of that core component can make or break a production. So we’ve made a guide to which GPUs should be powering your shoots, and why.
Professional-Grade GPUs for LED Display Synchronisation
Every virtual production developer dreams of having the Nvidia Quadro RTX 8000, but it doesn’t come cheap: this graphics card will set you back just over £6,000. The card is designed with professional use cases in mind, such as running multiple display technologies on-set, and it’s built to a server grade with reliability and performance at its core. So if you can afford one, stop reading and go buy one. Otherwise, here’s a rundown.
The Quadro RTX 8000
The Nvidia Quadro RTX 8000 is a top-of-the-range, server-grade graphics card, perfect for high-end virtual production, especially for synchronising display systems where frame-accurate distribution is required, such as in-camera real-time VFX.
The card comes with 48GB of ultra-fast GDDR6 video memory for high-performance processing, and 72 RT cores (yes, 72) that Unreal Engine can harness, allowing freedom in lighting and raytracing in record time with its 11 Giga Rays/sec performance.
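As a rough illustration of what an 11 Giga Rays/sec budget means in practice, here is some back-of-the-envelope arithmetic only (the 4K-at-24fps target is an assumed example, and real renderers spend rays very differently):

```python
# Rough ray budget implied by the quoted 11 Giga Rays/sec figure,
# assuming a 4K UHD target at a 24fps cinema frame rate.
giga_rays_per_sec = 11e9
width, height, fps = 3840, 2160, 24

pixels_per_frame = width * height             # 8,294,400 pixels in a 4K frame
rays_per_frame = giga_rays_per_sec / fps      # rays available per rendered frame
rays_per_pixel = rays_per_frame / pixels_per_frame

print(f"Rays per frame: {rays_per_frame:,.0f}")   # ~458 million
print(f"Rays per pixel: {rays_per_pixel:.1f}")    # ~55
```

So at 4K and 24fps the headline figure works out to roughly 55 rays per pixel per frame, which is why real-time raytracing at this class becomes practical.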
The Quadro RTX 8000 also comes with 576 Tensor cores, which are used by common applications like DaVinci Resolve and the Adobe Creative Suite, extending your on-set abilities beyond Unreal. This card will rip through on-set rendering in Maya, C4D, and post jobs too.
On-set, it’s essential to have a server-grade GPU with a Sync II port for frame-accurate display rendering, especially with LED. This is vital for the ability to genlock a render machine or node machines with all the other equipment on-set, including the cameras, recorders and even audio equipment.
For example, when working with LED we have to render each panel on a separate render node (a PC specifically designed for the task). To avoid glitches and delays, the GPUs have to be genlocked so they render at the exact same time.
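The idea can be sketched in a few lines of Python: a `threading.Barrier` stands in for the hardware sync pulse, and no render node is allowed to scan out a frame until every node has finished rendering it. This is a conceptual toy to show why genlock prevents tearing across panels, not how Quadro Sync is actually programmed:

```python
import threading

# Toy model of genlock: every render node must reach the sync barrier
# before ANY node scans out the next frame, so the wall stays coherent.
NUM_NODES = 2    # e.g. one render node per LED panel
NUM_FRAMES = 3

barrier = threading.Barrier(NUM_NODES)  # stands in for the hardware sync pulse
scanout_log = []
log_lock = threading.Lock()

def render_node(node_id):
    for frame in range(NUM_FRAMES):
        # ... render this node's slice of the frame here ...
        barrier.wait()  # "genlock": block until every node has finished the frame
        with log_lock:
            scanout_log.append((frame, node_id))

threads = [threading.Thread(target=render_node, args=(n,)) for n in range(NUM_NODES)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Every node scans out frame N before any node begins frame N+1.
print(scanout_log)
```

Without the barrier, one node could race ahead and display frame N+1 while its neighbour is still on frame N — exactly the cross-panel tearing genlock exists to prevent.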
The Nvidia Quadro RTX 8000 is designed to server-grade standards
The architecture of the card itself is built with reliability and durability in mind: from the way the card’s ventilation passes air back through the case and out of the box, down to the card’s power distribution and consumption, it is all rigged for hard-core use.
Designed for constant professional use on-set or, in most cases, in server rooms, its advanced cooling allows for days if not weeks of constant use without any damage to performance. This is why we use Quadro-grade GPUs in our on-set GODBOX™ machines, which are also designed specifically to run all day under load on-set.
The Nvidia Quadro RTX 8000 can be combined with multiple GPUs over NVLink. This currently isn’t relevant to virtual production in Unreal Engine, as UE doesn’t support multi-GPU processing, although it’s a feature being worked on and is firmly on Epic’s to-do list. Multi-GPU can be used in many other rendering applications though, so it’s a nice feature to have.
The RTX 2080 Ti is a Developer / Indie Level GPU for VP
The RTX 2080 Ti is a middle-of-the-range consumer GPU option from Nvidia. This is a common choice for most folks looking to get into virtual production development: it’s got a decent 11GB of GDDR6 video memory, which will allow semi-complex scenes to be rendered in real-time without too much of a hit in frame rate.
The 2080 Ti is real-time ray tracing enabled, rated at 10 Giga Rays/sec
This GPU is a good virtual-production development tool, but that’s about it as far as professional on-set use goes: open up a heavy level and your FPS will seriously drop, and open two viewports and you’ll be lucky to hit 60FPS in each. This is fine for testing and R&D, but it wouldn’t be taken on-set (not by us anyway), as it doesn’t have a sync port and can’t be genlocked with the rest of the on-set GPUs. On smaller indie productions with just one render engine and one display this wouldn’t be such an issue; however, the 2080 Ti isn’t designed for the same intense use cases as the Quadro range.
The RTX 2080 Ti also has a less engineered cooling system: air is pushed out into the computer case itself rather than out of the chassis, which increases the need for internal case fans.
Hell, if you stacked a number of them in one box, as many offline renderers do for, say, Octane, you’d need to build what is essentially a jet stream of air to expel the heat from your case; otherwise you’ll get performance drops as your CPU and GPU temperatures rise beyond your BIOS limits, AKA even more FPS loss and lag.
This GPU comes in at roughly £1,199, a fair price for the performance it delivers, and as we say, it’s great for independent set-ups and development. It will do 90% of what a developer needs without any issues, and for a basic UE artist or developer it will be fine in any studio (or bedroom studio). But if you want to bring your machine on-set, go Quadro.
Here’s a rundown of the Quadros, prices, basic specifications:
On the 11th of December, Intel released the first RealSense lidar depth camera (L515). These new cameras are designed to measure depth and bring new tracking methods to the masses, with the small price tag of $350 (pre-orders are available now; they’re scheduled to ship in April).
The new RealSense L515 camera is able to provide accurate depth data from 25cm out to an impressive 9m. The small camera also comes with a built-in RGB camera, a Bosch-made inertial measurement unit, a gyroscope, and an accelerometer. All this kit is packed into a casing measuring just 61mm in diameter and 26mm in height, an incredible size for so much technology.
The L515 is optimised to handle motion blur through its internal vision processor, which allows for an exposure time of less than 100 milliseconds. The RealSense has a 4-millisecond photon latency, making it perfect for real-time applications. Both the lidar sensor and the RGB camera can record at up to 30 frames per second, allowing integration into VFX and film pipelines. The lidar sensor’s field of view is roughly 70 degrees by 55 degrees, enabling the capture of up to 23 million points of depth per second.
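That 23-million figure lines up with simple arithmetic if we assume a 1024 x 768 depth map (the maximum depth resolution Intel quotes for the L515) at the 30fps rate mentioned above:

```python
# Sanity-check the quoted "23 million points per second" figure,
# assuming a 1024x768 depth map at 30 frames per second.
width, height, fps = 1024, 768, 30

points_per_frame = width * height        # 786,432 depth points per frame
points_per_second = points_per_frame * fps

print(f"{points_per_second:,} points/second")  # 23,592,960, i.e. ~23 million
```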
Unreal Engine Support
The RealSense is supported by most major platforms and engines, including Python, ROS, C/C++, C#, Unity, Unreal Engine and OpenNI. The 4-millisecond latency and 23-million-point-per-second cloud make the RealSense camera well suited to real-time game engine development. Point clouds can now be rendered in Unreal Engine through the LiDAR Point Cloud plugin, and the low latency opens up use cases within virtual production, with improved depth-based VFX for a fraction of the current price.
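To give a feel for what an engine does with that depth data, here is a minimal sketch of turning a single depth sample into a 3D point using the standard pinhole-camera model. The intrinsics below are made-up example values, not L515 calibration data; in practice librealsense supplies real intrinsics and an equivalent deprojection helper:

```python
def deproject(u, v, depth_m, fx, fy, ppx, ppy):
    """Convert a pixel (u, v) plus its depth in metres into a 3D point
    in the camera's coordinate frame (pinhole model, no lens distortion)."""
    x = (u - ppx) / fx * depth_m
    y = (v - ppy) / fy * depth_m
    return (x, y, depth_m)

# Hypothetical intrinsics for illustration only.
fx = fy = 600.0           # focal lengths in pixels
ppx, ppy = 512.0, 384.0   # principal point (image centre)

# A pixel at the image centre, 2 m away, sits on the optical axis:
print(deproject(512, 384, 2.0, fx, fy, ppx, ppy))  # (0.0, 0.0, 2.0)
```

Run over every pixel of a depth frame, this is exactly the operation that produces the point clouds the Unreal plugin renders.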