The Creative Console by Monogram is a modular desktop device designed to improve productivity and creative workflows. Whether it's editing photos in Photoshop or video in Premiere Pro, the Creative Console lets users work across a variety of applications with ease.
And now Monogram has entered virtual production with its latest Unreal Engine 4.26 plugin. The plugin is designed to reduce barriers between the director and crew on set, which it achieves by connecting the Monogram Creator software to Unreal Engine. Creator now allows modules to be mapped to in-engine features and assets, whether that's a virtual camera or a dynamic light fixture.
Lighting can now be controlled via Monogram Creator, allowing users to adjust intensity, pattern and color through orbiters and dials. The Creative Console also extends into the virtual camera department, with the ability to adjust camera settings through the simple spin of a dial: focus, zoom and aperture can all be driven through Monogram. Once you're ready to record, Monogram Creator's Sequencer and Take Recorder integration can also be mapped, putting all the controls for shooting on one designated deck.
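Under the hood, this kind of mapping boils down to turning dial ticks into parameter changes and pushing them into the engine. Monogram's plugin handles all of this internally, but as a rough sketch of the idea, here's how a dial delta could be converted into a light intensity and packaged for Unreal's Remote Control HTTP API — the object path, property name, step size and port below are illustrative assumptions, not Monogram's actual implementation:

```python
# Sketch: dial ticks -> clamped light intensity -> Remote Control request body.
# All names/paths here are hypothetical; Monogram Creator does this mapping for you.
import json

UE_REMOTE_URL = "http://localhost:30010/remote/object/property"  # default Remote Control port

def dial_to_intensity(current, delta, step=50.0, lo=0.0, hi=10000.0):
    """Convert a dial tick delta into a clamped light intensity value."""
    return max(lo, min(hi, current + delta * step))

def build_set_property_request(object_path, value):
    """Build the JSON body for a Remote Control 'set property' call."""
    return {
        "objectPath": object_path,
        "propertyName": "Intensity",
        "propertyValue": {"Intensity": value},
        "access": "WRITE_ACCESS",
    }

# Example: three clicks of the dial up from 1000
new_intensity = dial_to_intensity(1000.0, 3)
body = build_set_property_request(
    "/Game/Stage.Stage:PersistentLevel.KeyLight.LightComponent0", new_intensity
)
payload = json.dumps(body)  # would be PUT to UE_REMOTE_URL on a live system
```

The clamp keeps a runaway dial spin from driving a fixture to a nonsense value, which matters when the operator is turning knobs mid-take.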
It takes just 10 minutes and a steady hand to install a Quadro Sync II card. With this card installed in your PC you can connect multiple GPUs to run in sync and begin synchronising your LED video wall for virtual production. If you have a virtual production PC with an NVIDIA Quadro GPU installed, simply open up your machine, locate an available PCIe slot, carefully seat the card in the slot, connect the 6-pin PCIe or SATA power cable (you should have one already, but check that your rig has a spare) and then connect the Quadro Sync II cable from your GPU to the top of the Sync II card.
To install the driver, visit the NVIDIA driver download page, where you can find the latest drivers for all NVIDIA products. Use the search function to select the Quadro Sync Series and download the Quadro Sync drivers. After installing the drivers, move down to "Quadro advanced options", where you'll be able to download the Mosaic utility. Mosaic technology combines multiple displays or projectors into a single virtual display. Your Quadro Sync is now ready to go and, if everything is set up correctly, will appear as a device in the NVIDIA Control Panel. Next time we'll look at how to connect a cluster of Quadro Sync connected machines to control the different LED video walls in your set-up.
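The arithmetic Mosaic performs when stitching panels into one virtual display is simple enough to sketch. The helper below is our own illustration (not NVIDIA code): it computes the combined resolution of a rows-by-columns grid of identical displays, optionally subtracting the overlap pixels used for projector edge-blending:

```python
def mosaic_resolution(rows, cols, disp_w, disp_h, overlap_x=0, overlap_y=0):
    """Total resolution of a Mosaic-style virtual display built from a
    rows x cols grid of identical panels. Overlap pixels (used for projector
    edge-blending) are shared between neighbours, so they're counted once."""
    width = cols * disp_w - (cols - 1) * overlap_x
    height = rows * disp_h - (rows - 1) * overlap_y
    return width, height

# A 2x2 wall of 1080p panels presents to the OS as a single 4K surface
print(mosaic_resolution(2, 2, 1920, 1080))  # (3840, 2160)
```

This is why, once Mosaic is configured, Unreal and nDisplay only ever see one large display rather than four small ones.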
Installing a Quadro Sync II Card for LED Video Wall Synchronisation.
The ugly cousin of virtual production is system tuning. It's not virtual production as such, but it is required to make a virtual production system work. Fast, accurate on-set system tuning is a crucial part of the virtual production workflow.
This post will also help other crew members to understand what the VP crew are doing.
From our side as the on-set VP crew, knowing how to integrate components swiftly, what settings to change and how to optimise the system, often between shots, is our job. It's the VP team, working as a unit, who are responsible for maintaining the low-latency, stable performance of the VP system. If your VP crew are asking for access to the camera or lens gears, or if they need a few moments to adjust tracking, here's what they are up to.
All systems suffer from bottlenecks, sync issues, networking errors and latency problems, but systems assembled from consumer-grade components will give consumer-grade results. Most consumer-grade systems don't support lens calibration, encoding or genlock/timecode, which inevitably leads to constant on-set firefighting.
A professional system won't eliminate the firefighting and tuning process entirely, but its integrated software and hardware provide tools to optimise tuning easily, live on-set. It achieves this stability through battle-tested on-set components, from the computer at its heart through to the absolute camera tracker. A professional system can cost in the region of £100,000+, but this would typically be a full-service system that comes with remote/on-site training, setup and hardware/software support.
On-Set System tuning?
Every system suffers from the same issues regardless of its price. Your main enemies will be frame synchronisation across the system (hardware and software), tracking, and chroma keying (real-time compositing). Each of these has to be checked before each shot: as the DOP moves across the set and the lighting, tracking perspective and location change, the system has to be briefly tuned and checked. Left unchecked, a two-frame slip, jittery tracking or a hard-edged chroma key can break any illusion of reality and ruin the shot.
Doing your system check in between shots is where professional systems come into their own. They are designed around on-set use, and their optimised tuning, teamed with a good operator, drastically cuts on-set tuning times, to the point where it's almost done on the fly and just requires five sharp pans from the DP. System checks with a consumer-level system take much longer, as consumer systems don't easily allow for fast fix iterations.
Knowing where an issue is in your system is the key to fast on-set tuning. You need to tell the difference between a tracking issue and a synchronisation issue just by looking at a monitor, and be as fast with your diagnosis as you are with your implementation. The less you need to worry about, the more you can concentrate on the tuning issue at hand. This is one reason why we developed the OSF virtual production computers, designed specifically for real-time production computing: at least we know it's never our computers that need the tune-up.
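One rule of thumb behind that diagnosis: a steady, constant offset between tracker data and video frames points at a synchronisation or delay setting, while a noisy, varying offset points at tracking jitter. A minimal sketch of that check, assuming you can log paired timestamps (the format, fps and threshold below are our own illustrative choices):

```python
def frame_offsets(tracker_ts, video_ts, fps=25.0):
    """Per-sample tracking-to-video delay, in frames, from paired
    timestamps in seconds (one tracker sample per video frame)."""
    return [(v - t) * fps for t, v in zip(tracker_ts, video_ts)]

def diagnose(offsets, jitter_threshold=0.25):
    """Constant offset => sync/delay problem; varying offset => tracking jitter."""
    mean = sum(offsets) / len(offsets)
    spread = max(offsets) - min(offsets)
    return ("tracking jitter" if spread > jitter_threshold
            else "sync offset of %.1f frames" % mean)
```

A result like "sync offset of 2.0 frames" is the good kind of problem: it's fixed with a single delay setting rather than a tracker recalibration.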
We are pleased to announce that On-Set Facilities™ has raised funding from Clwstwr, an ambitious five-year programme to create new products, services and experiences for screen. Already in use by major studios and productions worldwide, On-Set Facilities will use the funding to develop its OSFX on-set operating system and deliver even more powerful and reliable computing on-set.
Virtual sets used for a TV commercial connecting crew and talent across Europe and the USA.
On-Set Facilities develops solutions that combine hardware, software and networking to bring production crews together on both physical and virtual sets. This is called virtual production, and it is being rapidly adopted at all levels of the production industry. Coupled with a range of powerful OSF production edge computers, designed to deliver low-latency on-set computing power for real-time VFX and virtual production, OSFX StormCloud is a multicast VPN for virtual production, designed to provide accurate, low-latency transport of video and data and to answer the technical challenges of real-time virtual production at the computing level.
Working with Cardiff University and On-Set Facilities, Clwstwr will build on Wales' success in making creative content by putting research and development (R&D) at the core of production. "We want to create a culture of innovation in the cluster which will move the screen sector from a position of strength to one of leadership, internationally." With Clwstwr funding, the Welsh technology company On-Set Facilities aims to further develop its OSFX operating system into the world's most reliable and powerful computing platform, connecting the company's leading edge production systems globally.
Virtual production spreads across many kinds of hardware and software: whether you're working with nDisplay powering graphics on LED screens or virtual scouting in Unreal Engine, VP covers it all.
But at the heart of it all are the computers powering and enabling the whole process. The engineering of that core component can make or break a production, so we've made a guide to which GPUs should be powering your shoots, and why.
Professional Grade GPU for LED display synchronisation
Every virtual production developer dreams of having the NVIDIA Quadro RTX 8000, but it doesn't come cheap: this graphics card will set you back just over £6,000. The card is designed with professional use cases in mind, such as running multiple display technologies on-set, and it's built to a server grade with reliability and performance at its core. So if you can afford one, stop reading and go buy one. Otherwise, here's a rundown.
The Quadro RTX 8000
The NVIDIA Quadro RTX 8000 is a top-of-the-range server-grade graphics card, perfect for high-end virtual production, especially for synchronising display systems where frame-accurate distribution is required, such as in-camera real-time VFX.
The card comes with 48GB of ultra-fast GDDR6 video memory for high-performance processing, and 72 RT cores (yes, 72) which Unreal Engine will harness, allowing freedom in lighting and ray tracing in record times with its 11 Giga Rays/sec performance.
The Quadro RTX 8000 also comes with 576 Tensor cores, which are used by common applications like DaVinci Resolve and the Adobe Creative Suite, extending your on-set abilities further than just Unreal. This card will rip through on-set rendering in Maya, C4D and post jobs too.
On-set, it's essential to have a server-grade GPU with a Sync II port for frame-accurate display rendering, especially with LED. This is vital for the ability to genlock a render machine, or node machines, with all the other equipment on set, including the cameras, recorders and even audio equipment.
For example, when working with LED we have to render each panel on a separate render node (a PC specifically designed for the task). In order to avoid glitches and delays, the GPUs have to be genlocked to render at exactly the same time.
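A quick way to picture what genlock buys you: at any instant, every render node should be on the identical frame. A toy check of that invariant (our own illustration; a real system reads this from the sync hardware, not from a list):

```python
def nodes_in_sync(frame_indices, tolerance=0):
    """True if all render nodes report the same frame index (within tolerance).
    Without genlock the indices drift apart over time, and the mismatch shows
    up as tearing and glitches across the panels of the LED wall."""
    return max(frame_indices) - min(frame_indices) <= tolerance

print(nodes_in_sync([1001, 1001, 1001]))  # True: wall renders as one surface
print(nodes_in_sync([1001, 1003]))        # False: two panels a frame+ apart
```

The Sync II card's job is to make the first case true continuously, by driving every GPU's scan-out from one common timing reference.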
The NVIDIA Quadro 8000 is designed to server-grade standards
The architecture of the card itself is built with reliability and durability in mind, from the way the card's ventilation passes air back through the GPU's housing and out of the case, down to the way the card's power distribution and consumption are rigged for hard-core use.
Designed for constant professional use on-set or, in most cases, in server rooms, its advanced cooling allows for days if not weeks of constant use without any drop in performance. This is why we use Quadro-grade GPUs in our On-Set GOD BOX™ machines, which are also designed specifically to run all day under load on-set.
The NVIDIA Quadro RTX 8000 can be combined with multiple GPUs over NVLink. This currently isn't relevant to virtual production in Unreal Engine, as UE doesn't support multi-GPU processing, although it's a feature being worked on and is firmly on Epic's to-do list. Multi-GPU can be used in many other rendering applications though, so it's cool to have as a feature.
The RTX 2080 Ti is a Developer/Indie Level GPU for VP
The RTX 2080 Ti is a middle-of-the-range consumer GPU option from NVIDIA. It's a common choice for most folks looking to get into virtual production development: it's got a decent 11GB of GDDR6 video memory, which will allow semi-complex scenes to be rendered in real time without too much of a hit to frame rate.
The 2080 Ti is real-time ray tracing enabled, rated at 10 Giga Rays/s
This GPU is a good virtual production development-level tool, but that's about it as far as professional on-set use goes: open up a heavy level and your FPS will seriously drop, and open two viewports and you'll be lucky to hit 60 FPS in each. This is fine for testing and R&D, but it wouldn't be taken on-set, not by us anyway, as it doesn't have a sync port and can't be genlocked with the rest of the on-set GPUs. On smaller indie productions with just one render engine and one display this wouldn't be so much of an issue, but the 2080 Ti isn't designed for the same intense use cases as the Quadro range.
The RTX 2080 Ti has a less engineered cooling system, where air is pushed out into the computer case itself rather than out of the box, which increases the need for internal fans.
Hell, if you stacked a number of them in one box, as many offline renderers do for, say, Octane, you would essentially need to build a jet stream of air to expel the heat from your case. Otherwise you'll get performance drops as the temperature of your CPU and GPUs rises beyond your BIOS settings: even more FPS loss and lag.
This GPU comes in at roughly £1,199, a fair price for the performance that comes with it, and as we say it's great for independent set-ups and development. It will do 90% of what a developer needs without any issues, and for a basic UE artist or developer it will be fine in any studio or bedroom studio. But if you want to bring your machine on-set, go Quadro.
Here's a rundown of the Quadro range, with prices and basic specifications:
At its core, virtual production is about bringing people together to tell stories. VP tools help storytellers capture performances in real time, including those of the camera crew: virtual cameras capture human camera moves, recording each move to a timeline. Once a performance is captured digitally it can be replayed in virtual space, shared and even altered. This applies to shooting everything in-engine, from real-time mo-cap characters to in-engine VFX; virtual cameras are set to become a storytelling tool.
Do you need a professional camera tracker to create a really good virtual camera? The answer is no. With a £120 VIVE tracker and some bits and bobs off eBay you can build a truly awesome VR camera rig and start shooting real-time cinematics. We'll be following this post up with tutorials later.
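To make the "record a camera move to a timeline, then replay it" idea concrete, here's a minimal sketch. The pose source is stubbed with plain numbers; on a real rig the positions and rotations would stream in from the VIVE tracker, and the class names and simple hold interpolation are our own simplifications:

```python
# Sketch: capture a virtual camera's poses to a timeline, then replay them.
from dataclasses import dataclass

@dataclass
class Keyframe:
    time: float      # seconds from the start of the take
    position: tuple  # (x, y, z) in metres
    rotation: tuple  # (pitch, yaw, roll) in degrees

class CameraTake:
    """A recorded camera move: a list of timed poses that can be resampled."""
    def __init__(self):
        self.keys = []

    def record(self, time, position, rotation):
        self.keys.append(Keyframe(time, position, rotation))

    def sample(self, t):
        """Return the last recorded pose at or before time t (hold interpolation)."""
        pose = self.keys[0]
        for k in self.keys:
            if k.time <= t:
                pose = k
        return pose
```

Because the take is just data, it can be replayed against a different scene, trimmed, or handed to another operator — which is exactly why captured camera moves are shareable and alterable.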
VIVE Simulcam Virtual Camera Rig designed by DVP Asa Bailey.
VIVE as a camera tracker
The VIVE is solid for room-scale, fully VR shooting, and a hands-down winner for virtual scouting. Next we'll try shooting real-time mo-cap on this rig. But now the bad news: the VIVE is not good for shooting mixed reality. It fails to hold visual sync where you need it, between the optical camera's SDI I/O and the virtual CG layer.
Out of the box the VIVE fails big time here: it's very slippy. You need sync, and ideally Live Link too, so you can see camera movement in editor mode and not just in play mode. Using the VIVE for mixed reality shooting was frustrating, verging on impossible.
More advanced developers will be able to create a solution for this, and you can also look at how to talk to the VIVE directly. But as an out-of-the-box mixed reality, virtual studio solution, the VIVE has no features, and for this reason we're saying that, at the moment, the VIVE out of the box is very much only for fully virtual production.
But here's the good news: it's crazy good as a tracker for shooting fully immersed in-engine. On OSF render engines the response of the VIVE tracker was insane; given the level of grief we put this tracker through, it shocked us how rock solid the response rates were. Always immediate. The VIVE tracker has a 270-degree field of view thanks to its odd shape, though we are concerned about the strength of the base stations. Next we want to battle-test the VIVE tracker on a really big stage with a ton of mo-cap and animation. The base stations work great in a room at a distance of 5m, but what will happen when we really push this? The jury is out.
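When we say the response was rock solid, the measurable quantity is positional jitter: hold the tracker perfectly still, capture a burst of poses, and look at the spread around the mean. A sketch of that measurement (the sample format is our own assumption; real samples would come from the tracker's pose stream):

```python
def positional_jitter(samples):
    """RMS deviation of (x, y, z) tracker positions around their mean,
    from a burst of poses captured while the tracker is held still.
    Near-zero means a steady, trustworthy track; a large value means jitter."""
    n = len(samples)
    mean = [sum(s[i] for s in samples) / n for i in range(3)]
    sq = sum(sum((s[i] - mean[i]) ** 2 for i in range(3)) for s in samples)
    return (sq / n) ** 0.5
```

Running the same burst test at 5m and again at the edge of the stage is how we'd settle the base-station range question with numbers rather than eyeballs.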
Next week we'll get into the data and renders.
So what’s the deal with the VIVE as a camera tracker?
We'd conclude that if you want to shoot fully virtual, shooting in-engine cinematics with a VIVE as your camera input is amazing: get a VIVE for £100. If you want to do any serious mixed reality virtual production work or real-time VFX previz, you're still going to need to open your pocket and find a professional budget to get the right equipment for the job. But if you want to shoot 100% in-engine, dive in – get a VIVE.