Looking forward to our trip to Vancouver, one of the largest VR/AR production hubs in the world and home to our Canadian partners. We can’t wait to see what virtual production gems Vancouver has to offer.
How do you focus on something that is not there, but is seen “in” the LED wall, something like an animated digital character, that’s inside the LED virtual set?
Pulling focus to objects and subjects that are displayed on an LED wall.
Virtual Production Training / Director of Virtual Production Asa Bailey.
You want to pull focus from the cast on-set to a fully virtual character that’s being displayed on the LED wall. The problem: if you roll your optical focal point forward from position A to position B with the on-set camera, as you roll to and then through the actual front of the LED wall, you’ll end up hitting the moiré zone. Moiré is a French word meaning “having a wavy or rippled surface pattern”, and it’s what we call the effect when the on-set camera’s focus hits the LED wall and creates a mess of lines and artifacts in your shot. Not good. So how do you shoot around this and pull focus to subjects inside the LED virtual set?
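One practical way to stay out of the moiré zone is to keep the LED wall outside the camera’s depth of field during the pull. As a rough sketch using the standard thin-lens depth-of-field approximation (the lens, stop, circle-of-confusion and distance values below are example assumptions, not tied to any particular camera or stage), you can check whether the wall falls inside the sharp zone for a planned focus distance:

```python
# Depth-of-field check: does the LED wall land inside the sharp zone?
# Standard thin-lens DoF approximation; all numbers are example assumptions.

def depth_of_field(focal_mm, f_stop, focus_m, coc_mm=0.03):
    """Return (near, far) limits of acceptable sharpness in metres."""
    f = focal_mm / 1000.0          # focal length in metres
    coc = coc_mm / 1000.0          # circle of confusion in metres
    hyperfocal = f * f / (f_stop * coc) + f
    near = (hyperfocal - f) * focus_m / (hyperfocal + focus_m - 2 * f)
    if hyperfocal - focus_m <= 0:  # focused at/past hyperfocal: far limit is infinity
        far = float("inf")
    else:
        far = (hyperfocal - f) * focus_m / (hyperfocal - focus_m)
    return near, far

# Example: 50mm lens at f/2.8, pulling focus to a virtual subject that appears
# 6m from camera, with the physical LED wall standing 4.5m from camera.
near, far = depth_of_field(50, 2.8, 6.0)
wall_distance = 4.5
wall_in_focus = near <= wall_distance <= far
print(f"sharp zone: {near:.2f}m to {far:.2f}m, wall sharp: {wall_in_focus}")
```

In this example the sharp zone starts around 5m, so the wall at 4.5m stays soft and the pull lands on the virtual subject without the wall’s pixel grid snapping into focus. Treat it as a pre-viz sanity check, not a substitute for testing on your actual wall and lens package.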
Continue reading How to pull focus on subjects displayed on the LED wall | Virtual Cinematography
In tech terms, the word metaverse refers to a pervasive, always-on digital dimension that you can see and interact with. Think of the internet, with millions of web pages you access through a browser; now imagine the internet on everything around you, where you can see it and touch it. Also, it’s not one virtual world or dimension, it’s many; no one company will own the metaverse.
You switch on your AR glasses and you see a virtual, real-time overlay of the metaverse: graphics that you can touch, layered over your real world. You can jump in and out of this digital twin universe; it’s AR, it’s VR. You flick through services, apps and ways to communicate and collaborate, on/off.
That’s the visual internet or Metaverse dream.
Continue reading The Metaverse, what it is and why it’s happening
On-Set Facilities CEO Asa Bailey helms virtual productions both on-set and in the cloud, and since the pandemic hit, he’s never been busier. In this post we sit down with Bailey to try and extract more about his work as a Virtual Production Supervisor. This post has been updated.
Shooting fully virtual, in-engine, is one of the most liberating ways to shoot. Instead of the usual metal seen on any other shoot, there’s no real camera. We use a tethered simul-cam, or virtual camera rig, built using an ATOMOS as both viewfinder and onboard recorder. When we see the shot we want, we simply press record on the camera’s recorder and capture the shot in real-time. Thanks to the way the GODBOX Real-time Ready architecture works, there’s almost no delay between the virtual camera rig operator’s moves and what’s recorded on the drive. Optical motion capture technology is not cheap, but this is certainly a professional mo-cap solution, including all the Vicon cameras (6 Vero 2), a set of passive/active markers and the Vicon Shogun software.
The first independent VP test studio of its kind in the UK, the new research and innovation studio run by Digital Catapult and Target3D will allow independent and early-stage production companies in the media and creative industries to experiment with virtual production technologies and develop new tools and applications.
The Varjo XR-3 is an enterprise-level HMD (head-mounted display) system that provides high-end, high-fidelity AR and VR experiences. The XR-3 is one of the most advanced headsets on the market, with an extensive specification allowing for up to 2880 x 2720 pixels per eye and refresh rates of up to 90Hz.
The race is on: there’s a whole new breed of creative on the way, and it’s going to make the shift from shooting on film to digital look like Pac-Man.
In tech circles there is something called coopetition: it’s where competing companies work together to serve and innovate for the benefit of the whole (think NVIDIA and Intel). As they say, “a rising tide lifts all boats”, and that’s the case now with the creative production industry too.
Epic Games released Unreal Engine 4.27 last month, bringing buckets of new features and updates across all sectors. Today we’re looking over the In-Camera VFX pipeline, which has had numerous changes in UE 4.27, improving both usability and performance. The new version of In-Camera VFX replaces the need to manually edit external config files and now presents users with a new interface within the engine to configure layouts, pixel pitches and projection policies.
nDisplay Config Asset – This is the first of the new nDisplay-related assets you’ll use; it’s found under nDisplay in the Content Browser. The nDisplay Config Asset is where you configure your stage setup, and what you’ll use to bring your stage into levels and scenes.
nDisplay 3D Config Editor – This is the replacement for the previous .CFG config file, which was created externally and provided the engine with important parameters like projection policies, viewport and node configurations. The nDisplay 3D Config Editor now does all of the above and provides new features allowing for easier visualisation. The interface is broken into four tabs and a details panel.
- Components – Shows the imported static meshes, frustum camera and sync tick components added to the config uasset.
- Cluster – This tab visualises the cluster layout and hierarchy, showing specific viewports and their dedicated server.
- Output Mapping – Provides users with a visualisation of the output canvas in relation to the input pixel resolutions, whilst also showing which server renders each viewport.
- Configuration Viewport – This tab allows a simple visualisation of the stage itself and lets users place the stage accurately in relation to the real world.
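For context on what the new editor replaces, this is roughly the shape of the external .CFG files earlier engine versions needed, a sketch from memory with entirely hypothetical IDs, addresses and dimensions, not a file from any real stage:

```ini
; Sketch of a pre-4.27 external nDisplay config (hypothetical values).
; In 4.27 this all lives inside the nDisplay Config Asset instead.
[cluster_node] id=node_1 addr=192.168.1.10 window=wnd_1 master=true
[window]       id=wnd_1 viewports=vp_1 fullscreen=true WinX=0 WinY=0 ResX=1920 ResY=1080
[viewport]     id=vp_1 x=0 y=0 width=1920 height=1080 projection=proj_1
[projection]   id=proj_1 type=simple screen=scr_1
[screen]       id=scr_1 loc="X=1,Y=0,Z=0" rot="P=0,Y=0,R=0" size="X=1.7,Y=1"
```

Every node, window, viewport and projection had to be declared by hand like this and kept in sync across machines, which is exactly the bookkeeping the new in-engine Config Editor and its visual tabs take over.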
Switchboard – Switchboard now stands in for the previous Launcher and Listener apps, whilst also adding a range of new user-friendly features, including the ability to connect, launch and monitor all running nodes.
What is Nvidia Broadcast?
Nvidia Broadcast is one of the few applications that provides an AI-driven compositing solution to users. Nvidia Broadcast allows users to apply effects over an incoming camera feed; in this demonstration we used the “replace background” feature, allowing us to get a composite without the need for any greenscreen. Nvidia achieves this by harnessing the power of both machine-learning image analysis and depth perception.
Once we have our composited green background in Nvidia Broadcast, we move on to streaming the feed into Unreal Engine. From the real-time engine we have the ability to place the talent into any 3D environment, no longer limited to two dimensions.
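The final step in the engine is essentially a chroma key over the clean green background Broadcast hands us. As a minimal illustration of that idea (pure NumPy, not the actual Unreal or Off World Live keyer; the threshold and colour values are assumptions), keying out the green looks like this:

```python
import numpy as np

def chroma_key(frame, bg, green_thresh=1.3):
    """Replace pixels that are dominantly green with a background frame.

    frame, bg: float arrays of shape (H, W, 3) with values in [0, 1].
    green_thresh: how much the green channel must dominate red and blue.
    Illustrative values only; a production keyer does far more (spill
    suppression, edge softening, etc.).
    """
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    mask = (g > green_thresh * r) & (g > green_thresh * b)  # greenscreen pixels
    out = frame.copy()
    out[mask] = bg[mask]
    return out

# Tiny demo: top row is the pure-green Broadcast background,
# bottom row is the "talent" (grey pixels that should survive the key).
frame = np.array([[[0.0, 1.0, 0.0], [0.0, 1.0, 0.0]],
                  [[0.5, 0.5, 0.5], [0.5, 0.5, 0.5]]])
bg = np.full((2, 2, 3), 0.2)  # stand-in for the virtual set
out = chroma_key(frame, bg)
```

In the real pipeline this keying happens on the GPU inside Unreal, but the logic is the same: green-dominant pixels get swapped for the virtual environment, everything else is kept.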
See below our tutorial and demonstration video:
Relevant links and software:
- Nvidia Broadcast
- Unreal Engine 4
- OBS Studio
- Off-World-Live / Spout2 (OBS plugin)
- Off-World-Live (Unreal Engine Plugin)
Moving forward, this proof of concept has proved itself to work and now needs official testing back in our OSF lab on a GODBOX™ with a 4K camera feed input.