
Miniature Green Screen for Developing Virtual Production

3D Photoreal Environment

Calling all would-be virtual production pipeline developers out there: you don’t need a big studio, or even a big green screen, to work on your VP compositing pipelines. Remembering where we came from, five years ago we started OSF with essentially a PC, a cut-off roll of chroma green paper and a lamp, and we set up our first VP pipeline in our front room. Today we develop pipelines for some of the world’s top directors, VFX studios and content studios. If we can do it, so can you. So what are you waiting for? Go grab something green, or blue.

Continue reading Miniature Green Screen for Developing Virtual Production


NukeX Plugin for Unreal Engine 4.27

Nuke Unreal Engine Plugin

This week Foundry released “UnrealReader”, its new Unreal Engine plugin for real-time visualisation of Unreal Engine 4.27 scenes in NukeX. The plugin connects the two applications through UnrealReader nodes within NukeX and a NukeServer running in Unreal Engine 4.27.
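As a minimal sketch of the NukeX side, the snippet below creates an UnrealReader node from the Script Editor and tries to point it at the machine running the NukeServer. The knob name "address" is an assumption, not confirmed from Foundry’s documentation, so treat it as a placeholder and check the node’s Properties panel for the real connection settings.

```python
# Hedged NukeX sketch: create an UnrealReader node and attempt to set a
# connection knob. Knob name "address" is assumed; adjust to the actual plugin.
import nuke

reader = nuke.createNode("UnrealReader")  # node class name as released by Foundry

address_knob = reader.knob("address")     # hypothetical knob name
if address_knob:
    address_knob.setValue("127.0.0.1")    # machine running the Unreal NukeServer
else:
    nuke.tprint("No 'address' knob found; set the connection in the node's Properties panel.")
```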

Continue reading NukeX Plugin for Unreal Engine 4.27


Vive Tracking Plugin Unreal Engine 4.27

HTC Vive Tracker

Unreal Engine 4.27 includes a new Live Link plugin, “LiveLinkXR”, which lets users bring in live data from trackers and HMDs. The plugin currently only supports SteamVR, but any connected VR devices can be imported. It is extremely straightforward to use: simply add an XR Live Link source, select the desired devices (HMD, controllers and trackers), and look for the green light confirming the engine is receiving live data.
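If you prefer a scripted sanity check instead of watching for the green light, the sketch below lists the Live Link subjects the engine is currently receiving from the Unreal Editor Python console. It assumes the Live Link editor scripting functions are exposed to Python as shown; it only reads existing subjects and does not create the XR source itself (that step is still done in the Live Link window).

```python
# Hedged Unreal Editor Python sketch: after adding a LiveLinkXR source in the
# Live Link window, print the subjects (HMD, controllers, trackers) streaming in.
import unreal

subjects = unreal.LiveLinkBlueprintLibrary.get_live_link_subjects(
    include_disabled_subject=False, include_virtual_subjects=False
)

for key in subjects:
    # Each subject key identifies one tracked device streaming into Live Link.
    unreal.log("Live Link subject: {}".format(key.subject_name))
```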

Continue reading Vive Tracking Plugin Unreal Engine 4.27


Applying Audio2Face to your 3D Characters

Audio2Face Metahuman

Using the Nvidia Omniverse Audio2Face kit, users can generate real-time, AI-powered facial animation from a single audio source (see here for a full Audio2Face breakdown). This animation can then be applied to any humanoid 3D character through Audio2Face’s blend shape animation export. Below is a collection of tutorials demonstrating streamlined pipelines for applying Audio2Face animations to your own 3D characters in various 3D software packages.
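To illustrate the general idea of driving a character from exported blend shape weights, here is a small Blender (bpy) sketch. The file name "a2f_weights.json", its layout ({shape key name: list of per-frame weights}) and the mesh object name "Face" are hypothetical stand-ins, not Audio2Face’s actual export schema; in practice you would first convert or remap the export to your character’s shape key names.

```python
# Hedged Blender sketch: key a character's shape keys from per-frame weights.
import json
import bpy

obj = bpy.data.objects["Face"]            # assumed mesh object with shape keys
key_blocks = obj.data.shape_keys.key_blocks

with open("/tmp/a2f_weights.json") as f:
    weights = json.load(f)                # e.g. {"jawOpen": [0.0, 0.12, ...], ...}

for shape_name, per_frame in weights.items():
    block = key_blocks.get(shape_name)
    if block is None:
        continue                          # skip shapes the character doesn't have
    for frame, value in enumerate(per_frame, start=1):
        block.value = value
        block.keyframe_insert("value", frame=frame)
```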

Continue reading Applying Audio2Face to your 3D Characters


AI Real-Time Facial Animation in Nvidia Omniverse

Nvidia Audio2Face

Nvidia’s Audio2Face Omniverse kit harnesses deep learning to provide real-time facial animation from a single audio source. Audio2Face lets artists simplify 3D character facial animation and instantly generate facial expressions and reactions from voice-overs. It also allows users to retarget the captured animations to any human or human-esque 3D face, whether realistic or stylised.

Continue reading AI Real-Time Facial Animation in Nvidia Omniverse


GODBOX High Performance Computers (HPC) Power LED Walls Stages and Studios

What is HPC

High Performance Computing, in general terms, refers to the practice of aggregating computing power to deliver much higher performance than a typical desktop computer or workstation can provide.

Continue reading GODBOX High Performance Computers (HPC) Power LED Walls Stages and Studios


What is Lumen in Unreal Engine 5?

Unreal Engine 5 Lumen

Lumen is the new global illumination and reflections system in Unreal Engine 5, built to support the new generation of gaming consoles.

Lumen harnesses a fully dynamic indirect lighting pipeline that works in unison with an environment’s geometry, materials and light properties, giving artists an optimised workflow: lighting builds and reflection captures are effectively instant. Lumen also removes the need for reflection cubemaps, since it replaces those methods entirely and can render geometrically precise reflections.

Continue reading What is Lumen in Unreal Engine 5?


How to pull focus on subjects displayed on the LED wall | Virtual Cinematography

How do you focus on something that is not physically there but is seen “in” the LED wall, such as an animated digital character inside the LED virtual set?

Pulling focus to objects and subjects that are displayed on an LED wall.

Virtual Production Training / Director of Virtual Production Asa Bailey.

You want to pull focus from the cast on set to the fully virtual character displayed on the LED wall. The problem: if you roll your optical focal point forward from position A to position B with the on-set camera, as you roll to and then through the physical front of the LED wall, you’ll hit the moiré zone. Moiré is a French word meaning “having a wavy or rippled surface pattern”, and it is what we call the effect when the on-set camera’s focus lands on the LED wall and creates a mess of lines and artifacts in your shot. Not good. So how do you shoot around this and pull focus to subjects inside the LED virtual set?
Continue reading How to pull focus on subjects displayed on the LED wall | Virtual Cinematography