GODBOX Ranked Most Powerful Virtual Production Computer

The OSF GODBOX ranked as the 18th fastest computer in its class (out of 500,000 benchmarked builds) for powering real-time VFX and LED stage virtual productions. The newly released GODBOX Pro MKII is the first OSF computer to ship to studios with the new proprietary OSFX architecture, OSF’s answer to optimised on-set computing with direct-to-display technology.

Power each 4K LED Canvas with GODBOX™ Pro MKII

The new build ships with direct support for Unreal Engine, Unity 3D and Omniverse. It also marks On-Set Facilities’ move away from Intel’s i9 to AMD Threadripper, along with an upgrade from the NVIDIA Quadro RTX 8000 to the latest NVIDIA A6000 GPU. Full system spec.

Before we go any further, we would like to thank the teams at iMAG Displays and Treehouse Digital for inviting us down to their LED stages to test the power of the new GODBOX™ in action. A huge thanks also to Brompton, RED Digital Cinema and Ncam for supporting these tests. We should also clarify our ranking methods: we use multiple rendering benchmarks, ranking the GODBOX™ against a worldwide sample of hundreds of thousands of computer builds. Benchmarking is vital, but at OSF we believe the only real way to test our new products is on-set.

Testing the GODBOX Pro MKII On-Set

It was a fantastic opportunity to show the full power of our latest GODBOX Pro on the big stage. On the day, the OSF virtual production computer powered both the camera tracking and the 3D UE4 virtual environments simultaneously, feeding 2 x 4K LED canvases. It drove low-latency, HDR, real-time rendered graphics to a 32m x 3m, 2.8mm pitch volume, powered by a total of just two synchronized GODBOX Pro computers.

The latest GODBOX Pro build proved itself on-set, capable of pixel-perfect mapping and delivering low-latency 4K output at 120fps to each 4K LED canvas. When we benchmarked the system at 4K, we still had a whopping 80% of computing headroom left for artists to utilise in-engine for real-time sets and VFX.

LED Virtual Production

Director of Virtual Production Asa Bailey @ On-Set Facilities 2021

In the first GODBOX™, the i9 was chosen for its stability at lower core counts, but with the latest advances in the OSFX architecture, OSF reports that production-ready stability has now been reached across all 32 cores and 64 threads, powering 8K UHD HDR output from all major engines.

Driving iMAG’s Black Pearl and Black Diamond ROE LED panel walls, the OSF-engineered virtual production system delivered blisteringly fast and reliable real-time raytracing with a 10bit, HDR, ACES pipeline. Both inner and outer camera frustums ran at full resolution, with no loss of frames or reliability, delivering a constant synchronized 24FPS in-editor. Using the OSF GODBOX on-set computing platform, with native Unreal Engine or Unity 3D support and no middleware, the results speak for themselves.

What You Need to Know about the new GODBOX™ Pro MKII

GODBOX Direct LED Processor Support 

Connecting the GODBOX Pro 4K outputs directly to the iMAG studio’s Brompton SX40 LED processors, without the need for any third-party media servers or software, not only reduced overall system latency; it also lets users drive the power of the SX40 processors directly, adjusting the brightness, colour and temperature of the LED wall to match any camera sensor.

Coincidentally, as the RED brand name suggests, on an LED stage we have found that RED cameras do indeed see a little more red on the RGB scale, compared to ARRI camera systems, which seem to see more of the blue/green in the RGB curves. With the GODBOX™ directly driving 4K HDR content to the Brompton SX40 LED processors, we were able to instantly adjust the colour levels of the LED wall and pull any RGB colour out of the sensor’s vision with ease.
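As a toy illustration of what that per-channel trim amounts to, here is a small sketch in plain Python: scale each RGB channel by a gain so that a sensor which “sees more red” is pulled back to neutral. The gain values are made up for the example; on the day, this adjustment happened inside the Brompton processor, not in code.

```python
# Toy per-channel gain trim, mirroring what the LED processor does in
# hardware. The gains are illustrative, not measured values.
def trim_rgb(pixel, gains=(0.96, 1.0, 1.02)):
    """Scale each channel and clamp to the 10-bit range [0, 1023]."""
    return tuple(min(1023, max(0, round(c * g))) for c, g in zip(pixel, gains))

# Example: a pixel a red-biased sensor renders slightly hot in the red channel.
print(trim_rgb((540, 512, 500)))  # -> (518, 512, 510)
```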

Go Directly from Your Engine to Your LED Processor  

But no matter what camera system you are using, the GODBOX virtual production computers connect your engine directly to the LED processors, just as if they were one massive 4K, 8K or even 16K TV or monitor. GODBOX users can therefore reduce their system lag and deliver graphics to the LED processors at up to 120FPS as standard.

Book a GODBOX Demo at Your Studio 

OSF is proud of what we accomplished, considering our whole crew was under ten people and we had only one day to install and tune the OSF GODBOX system on the iMAG LED stage. If you’d like to see the GODBOX in action in your studio, drop us a line at [email protected]

Changing ARRI Camera Settings in Real-Time with UE4

ARRI CAP, the back door to ARRI camera settings, allows VP developers to control on-set cameras from the VP system inside Unreal Engine.

When I spotted this post about how to change ARRI LUTs live in-camera during recording/streaming, I thought about how ARRI CAP (Camera Access Protocol) may be your gateway to controlling the ARRI Alexa from any device.

We drilled in a little deeper to see what we could find out about ARRI CAP and what else you could control remotely using the camera’s API. The answer seems to be that pretty much anything you can do at the on-camera menu, you can do remotely, using a long list of available call and set commands.

Mo-Sys Startracker Absolute Camera Tracking Solution

Integrating ARRI CAP API for Virtual Production Systems

If you shoot an optical layer of the actors’ on-set performance in-camera, in, say, a live composited virtual production, you will have a virtual replica of the on-set camera in your engine: your virtual camera. The movement of the on-set camera is tracked using camera tracking technology, and the camera operator’s performance is captured and streamed to the virtual camera in the engine. The V-cam shoots the CG elements of the scene while the on-set optical camera shoots the real actors. These shots are then composited in real-time on the VP system.

Calibrating the settings on both the optical cameras and the virtual in-engine cameras is one of the challenges of virtual production, especially between takes and set-ups, and until we found this, it was impossible during takes. With ARRI CAP, it looks like we can control not only the on-set camera’s LUT, but also the camera’s exposure, FPS, ISO and a raft of other settings, all through an interface accessed in a web browser or an in-engine UI plug-in. This will enable our virtual production crew to quickly calibrate physical and virtual cameras and change any number of camera settings between and during takes.
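To make that calibration bridge concrete, here is a minimal sketch of what pulling settings from the optical camera into the virtual camera could look like. ARRI’s CAP SDK is licensed and its wire protocol isn’t public, so the `CapClient` class, its methods and the setting names below are hypothetical stand-ins, not the real SDK API.

```python
# Hypothetical sketch: syncing on-set ARRI settings into the engine's
# virtual camera via a CAP-style client. CapClient and the setting names
# are invented stand-ins; the real ARRI CAP SDK is licensed and differs.
from dataclasses import dataclass


@dataclass
class VirtualCamera:
    """In-engine replica whose settings should mirror the optical camera."""
    iso: int = 800
    fps: float = 24.0
    lut: str = "neutral"


class CapClient:
    """Placeholder for a CAP connection to the camera (hypothetical)."""

    def __init__(self, host: str):
        self.host = host  # the real CAP transport details are not public

    def get(self, setting: str):
        # A real client would issue a CAP call here and return the value.
        raise NotImplementedError("requires the licensed ARRI CAP SDK")


def sync_virtual_camera(cap: CapClient, vcam: VirtualCamera) -> None:
    """Pull exposure settings from the optical camera between takes."""
    vcam.iso = cap.get("exposure.iso")   # hypothetical setting names
    vcam.fps = cap.get("recording.fps")
    vcam.lut = cap.get("look.lut")
```

In practice a bridge like this would run on the VP system, pushing the same values into the engine’s virtual camera so both cameras stay matched between and during takes.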

External Links

ARRI changing LUT files during live streaming

https://ymcinema.com/2019/12/05/arri-revolutionizes-live-streaming-real-time-luts-mixing-to-elevate-cinematic-look/

ARRI CAP Settings

https://www.arri.com/resource/blob/32390/9b6f862d3a0c8893b6577530e74be324/18-06-18-cap-2-5-camera-access-protocol-feature-list-alexa-mini-amira-sup-5-3-data.pdf

Epic merges Unreal Studio and Unreal Engine in 4.24

With the release of the Unreal Engine 4.24 preview, Epic Games confirmed some of the many changes coming to the new version of the engine. One of these is the merging of Unreal Studio and Unreal Engine: Unreal Studio will be retired after the release of 4.24. Users will still be able to use existing versions of Unreal Studio, but they won’t receive any further updates.

Unreal Studio

Unreal Studio was created to aid users in the industries of architecture, manufacturing, and product design. A huge feature of Unreal Studio which will become standard in Unreal 4.24 is the Datasmith toolkit. This will allow the aggregation and optimisation of 3ds Max, Revit, SketchUp Pro, Cinema 4D and CAD data within the engine.
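As a quick illustration of the Datasmith workflow, Unreal’s Python editor scripting (bundled with the engine) can drive the same import without touching the UI. This is a minimal sketch; the file path and destination folder are assumptions for the example.

```python
import unreal

# Assumed paths: a Datasmith export from 3ds Max/Revit/SketchUp/etc. and a
# destination folder in the project's content browser.
UDATASMITH_FILE = "C:/Exports/building.udatasmith"
DESTINATION = "/Game/Architecture/Building"

# Parse the .udatasmith file into an importable scene description.
scene = unreal.DatasmithSceneElement.construct_datasmith_scene_from_file(UDATASMITH_FILE)
if scene is None:
    raise RuntimeError(f"Could not parse {UDATASMITH_FILE}")

# Import the meshes, materials and scene hierarchy, then release the parser.
scene.import_scene(DESTINATION)
scene.destroy_scene()
```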

The integration will also give all users more mesh editing features within the engine, including static mesh editing, basic UV projections, jacketing and defeaturing optimisation tools, and a Variant Manager. This provides a faster, more optimised workflow for Unreal users who were previously limited in mesh editing ability.

Twinmotion

Twinmotion, the Unreal Engine-powered fast and easy real-time visualisation software for architecture, construction, urban planning, and landscaping, has received an extension of its free trial. Epic had previously announced it would be a free download until November 2019, but has extended that into the first quarter of 2020. Once downloaded, the software will remain free indefinitely. Epic claims the coming versions will offer greater photorealism, improved assets, and tools to facilitate collaborative workflows.

See Unreal Engine for updates and news on Unreal Engine 4.24

Unreal Engine On-set Crew Rates For Realtime Production

If you needed proof that realtime production is the future of 3D creation, you need look no further than this report from Epic Games. Realtime production is in high demand in many areas, with a specific rise in demand for skills in computer-assisted design, digital content creation, rendering, virtual and augmented reality, game engines, visualisation, 3D sculpting, industrial design, and film and video.

On-set, things are just the same, with realtime production skills in high demand for VR, realtime animation, virtual production and immersive productions. Take a look at these average rates for on-set crew members working in realtime production:

Realtime Production Crew Rates

Role             Advertising   Film / TV
DOP              £1200         £1800+
1st AD           £550          £750
1st AC           £450          £320
DIT              £400          £280
Motion Control   £480          £320
Model Animator   £450          £280
Gaffer           £550          £280

Commercial crew rates for realtime virtual production are on par with those in commercials (APA) and TV and film production (BECTU), but with a shortage of crew with hands-on realtime production experience, those who have it can charge a premium, and often find themselves rolling from one job straight to the next.

These skills are in high demand in five identified industries: construction and architecture, design and media, engineering, information technology, and manufacturing and production. What you will also notice is that the best-paid jobs are going to those with skills in Unreal Engine over any other software.

According to Epic Games, Unreal Engine is the most important thing you can have on your resume. It’s the fastest-growing game engine in the market, with jobs forecast to grow by 122% in the next 10 years, almost double that of their competition.

Unreal for Engineers 

With Unreal Engine spreading into every industry from automotive to robotics, it won’t be long until everyone needs at least a basic understanding of the engine. Engineers will be hit by this new way of designing and working, with the need for programming (or Blueprints) skills increasing immensely.

The New Wave of Realtime Computers

The 3D industry is getting ready for a new wave of production and rendering machines to support this increase in demand for Unreal talent. Recently, OSF and NVIDIA both released realtime machines optimised to cater to realtime needs. Unlike the old render machines, which took hours to produce seconds of virtual footage, that footage can now be rendered in just seconds.

Unreal Engine’s Last Hurdle 

The only area of 3D design which isn’t currently in the engine is modelling and asset creation. Modelling is normally done in Blender, Cinema 4D, Maya, etc. at the moment, but with Unreal Engine’s rapid growth it’s only a matter of time until the engine provides modelling tools internally.

On-set lighting and reflections for Virtual Production in Unreal Engine.

In this post, Director of Virtual Production Asa Bailey gives his views on, and a comparison between, various methods for creating realtime reflections on-set, and explains why he favours those methods that still give him options in post.

Asa Bailey [email protected]
Director of Virtual Production
On-Set Facilities

At this year’s NAB there was a bit of a battle brewing, with some folks claiming that their new “final pixel” virtual screen walls would make green screen shooting, and in fact post-production, a thing of the past. I don’t think so. I have used both on-set in virtual productions, and I’d say they are both valid options for shooting. It’s about using the right tool for the shot.

In my view, the big problem with virtual wall technologies is that the footage is “baked” in-camera and sold as “final pixel”, just as the guys say in this video, and this leaves far fewer, if any, options for layered work (post VFX passes).

But to their credit, virtual walls do give amazing reflections on faces, eyes and highly reflective surfaces (glass, paint, mirrors etc.), as shown in this demo from UE and Stargate Studios, who have worked on some big projects, so they do know a thing or two about VFX.

They sing the praises of “final pixel” in-camera as king for both creative and financial reasons. But come on guys, you know as well as I do that in the real world, in the reality of working in a studio pipeline, with studio Producers, Agencies, Talent and god knows who else wanting input into the final product, losing the option to post-process shots is too big a risk for Producers.

So would I shoot against a virtual wall? If the scene needed it and it would give the best result, yes. Would I use one to generate a realistic reflection on someone’s face who was, say, looking out of a window? Yes, I totally would, and do. But on my sets it would be out of shot, so I could really use any sort of screen; it would not necessarily have to be connected to my virtual production rig and offer realtime perspective, which is a nice bonus but not essential.

Lighting for virtual production

Matching the lighting on-set to the virtual set in Unreal Engine.

Whether as a Virtual Production Supervisor or as a Director, either way it’s my job to make sure that we leave the set with options, and that’s why I shoot realtime in layers, with a composite as an option. 80% of my shots are ready to grade and edit with minimal clean-up in post. The other 20% need critical changes, often driven by other stakeholders (Agencies, Producers etc.).

Shooting in layers.

Shooting in layers protects the Producer’s investment while cutting costs at the same time.

We shoot the talent layer (what the optical camera sees) and record the UE4-generated backgrounds as plates, along with other realtime VFX layers, on separate data layers, so that we can open them up later for VFX passes as required, if required. That gives us the choice: we can go to grade and edit with the composite files, which are recorded in 10bit 4:2:2 with audio and ready to drop onto any edit timeline, or we can open the layers up and add more VFX passes. We can even regenerate UE backgrounds using our tracking (FBX) data, so if we decide that we don’t want that tree in the background anymore, that’s ok: we simply remove it, run the BG layer again shooting with virtual cameras in UE, export the shot and drop it back into our layer stack (see the sketch below).
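As a rough outline of that regeneration step, here is a sketch using Unreal’s Python editor scripting to bring the recorded FBX tracking data back onto a shot’s virtual camera before re-rendering the background. The asset paths are assumptions for the example, and the SequencerTools FBX-import call has been renamed across engine versions, so check the Python API of your build.

```python
import unreal

# Assumed paths: the shot's level sequence and the FBX camera-tracking data
# recorded on-set (both hypothetical examples).
SEQUENCE_PATH = "/Game/Shots/Shot010/Shot010_Seq"
FBX_TRACKING_FILE = "C:/ShootData/Shot010_camera_track.fbx"

# Load the shot's level sequence and the current editor world.
sequence = unreal.load_asset(SEQUENCE_PATH)
world = unreal.EditorLevelLibrary.get_editor_world()

# Configure the FBX import so the tracked keys come through untouched.
settings = unreal.MovieSceneUserImportFBXSettings()
settings.set_editor_property("reduce_keys", False)

# Apply the tracked camera move to the sequence's bindings (the virtual
# camera), ready to re-render the background layer with the same move.
bindings = sequence.get_bindings()
unreal.SequencerTools.import_fbx(world, sequence, bindings, settings, FBX_TRACKING_FILE)
```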

The Virtual Set

A neon-lit tunnel in Unreal Engine that the cast will appear to walk down.

Realtime reflections are a challenge; in fact, lighting in virtual production is a challenge full stop, and you have to have done it many times to know what works. As they say, practice makes perfect. My Gaffers and DOPs have been lighting green screen sets for years, and they know where to put the light and how to spill just the right amount onto a subject (optical) to make it match in the final realtime composite. But the point is, if we need to, we can open up our recorded layers (optical, background, matte, foreground, composite) in post and fix any lighting issues. You may think this defeats the object of realtime production, but realistically, realtime is a turbo boost to any production, not a golden-egg-laying goose. Nine times out of ten you’ll still want post-production options, especially if you are shooting for high-end streaming shows, feature films or big brand commercials. For me, I want to shoot with realtime virtual production methods, but I also want the option to take advantage of the 100-year-old VFX industry and all the amazing talent it holds.

Methods For Creating On-Set Lighting for Virtual Productions

Back to creating reflections in realtime: I prefer real lights rigged up on-set to mimic the light that would be there in the virtual set. As you can see in the images below, we had an actor walk down a neon-lit corridor (the UE background); on-set, we set up a number of lights to mimic the virtual set lights, so that our optical layer inherited the scene’s lighting. Having some physical lighting on-set that matches, as closely as possible, the light you’d get on the virtual stage is important. In post, it really helps if there is something close to work with.

Virtual Production System

On-set virtual production system and studio lighting.

If the reflections need to move, as in a driving scene, I use three methods.

1 – Moving practical lights

We rig up stage lighting, with gels and temperatures set to match the virtual world, fixed to some sort of mechanical rig that will move the lights. Or we set up moving flags in front of the lights to mimic passing buildings and so on.

2 – Screens

We’ve used multiple large LCD walls to generate reflective images. Usually positioned out of shot (so we can still shoot for chroma), the screen casts a large layer of realtime light onto the scene. We’ll run the background on the screen from within UE, or as a simple video file playing while we do the take.

3 – Projectors

We also use projectors to project moving backgrounds; in fact, this is my favourite way to cast realtime moving reflections onto my cast. You can use scrims and diffusion to soften the projection, as the light from a projection on its own is often very hard, but this gives you another creative tool to tweak and get just right for your look.

I hope that helps. As I say, practice makes perfect and there is no one right way; it’s about the shot, the vision, the methods and obviously your budget. Any of the above approaches can be done on any budget; it’s just a matter of scale and complexity.

See the shot, craft the light, shoot.