
Miniature Green Screen for Developing Virtual Production

3D Photoreal Environment

Calling all would-be virtual production pipeline developers out there: you don’t need a big studio, or even a big green screen, to work on your VP compositing pipelines. Remembering where we came from, five years ago we started OSF with essentially a PC, a cut-off roll of chroma-green paper, and a lamp, and we set up our first VP pipeline in our front room. Today we develop pipelines for some of the world’s top directors, VFX studios and content studios. If we could do it, so can you. So what are you waiting for? Go grab something green, or blue.



Applying Audio2Face to your 3D Characters

Audio2Face Metahuman

Using the Nvidia Omniverse Audio2Face kit, users are able to generate real-time AI-powered facial animation from a single audio source (see here for a full Audio2Face breakdown). This animation can then be applied to any humanoid 3D character through Audio2Face’s blend-shape animation export. Below is a collection of tutorials demonstrating streamlined pipelines for applying Audio2Face animations to your own 3D characters in various 3D software packages.
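To give a feel for what the blend-shape export hand-off looks like in glue code, here is a minimal Python sketch. It assumes a JSON export with `exportFps`, `facsNames` and `weightMat` keys — one common layout of the Audio2Face blend-shape export, but verify the key names against your own version before relying on them:

```python
import json

def load_a2f_weights(source):
    """Parse per-frame blend-shape weights from an Audio2Face JSON export.

    Assumed layout (key names may differ in your export):
    {"exportFps": 30, "facsNames": ["jawOpen", ...], "weightMat": [[...], ...]}
    """
    data = json.loads(source) if isinstance(source, str) else json.load(source)
    names = data["facsNames"]
    # One dict per frame, mapping blend-shape name -> weight.
    frames = [dict(zip(names, row)) for row in data["weightMat"]]
    return data["exportFps"], frames

# Usage with an inline sample in the assumed layout:
sample = ('{"exportFps": 30, "facsNames": ["jawOpen", "mouthSmile"], '
          '"weightMat": [[0.5, 0.1], [0.6, 0.0]]}')
fps, frames = load_a2f_weights(sample)
for shape, weight in frames[0].items():
    pass  # e.g. set the named blend shape on your character rig in your DCC
```

From here, each tutorial’s DCC-specific step is just mapping the named weights onto the matching blend shapes of your character.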



In-Camera VFX in Unreal Engine 4.27 Tutorial

In-Camera VFX in Unreal Engine 4.27

Epic Games released Unreal Engine 4.27 last month, bringing buckets of new features and updates across all sectors. Today we’re looking at the In-Camera VFX pipeline, which has seen numerous changes in UE 4.27 improving both usability and performance. The new version of In-Camera VFX removes the need to manually edit config files externally and presents users with a new in-engine interface for configuring layouts, pixel pitches and projection policies.

What’s new?

nDisplay Config Asset – This is the first of the new nDisplay-related assets you’ll use; it’s found under nDisplay in the content browser. The nDisplay Config Asset is where you configure your stage setup, and it’s what you’ll use to bring your stage into levels and scenes.

nDisplay 3D Config Editor – This replaces the previous .cfg config file, which was created externally and provided the engine with important parameters such as projection policies and viewport and node configurations. The nDisplay 3D Config Editor now does all of the above and adds new features allowing for easier visualization. The interface is broken into four tabs and a details panel.


  1. Components – Shows the imported static meshes, frustum camera and sync tick components added to the config uasset.
  2. Cluster – This tab visualizes the cluster layout and hierarchy, showing each viewport and its dedicated server.
  3. Output Mapping – Output mapping provides a visualization of the output canvas in relation to the inputted pixel resolutions, whilst also showing which server renders each viewport.
  4. Configuration Viewport – This tab gives a simple visualization of the stage itself and allows users to place the stage’s location accurately in relation to the real world.
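For context, here is roughly what the retired external .cfg looked like in earlier engine versions. The field names below are recalled from the 4.26-era format and may not match your version exactly, so treat this as illustrative only:

```
[cluster_node] id=node_1 addr=192.168.0.101 window=wnd_1 master=true
[window] id=wnd_1 viewports=vp_1 fullscreen=true
[viewport] id=vp_1 x=0 y=0 width=1920 height=1080 projection=proj_simple
[projection] id=proj_simple type=simple screen=scr_1
[screen] id=scr_1 loc="X=1.5,Y=0,Z=1" size="X=3,Y=1.7"
```

Everything this file once described — nodes, windows, viewports, projection policies and screen geometry — now lives in the tabs of the nDisplay 3D Config Editor instead.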

Switchboard – Switchboard now stands in for the previous Launcher and Listener apps, whilst also adding a range of new user-friendly features, including the ability to connect, launch and monitor all running nodes.



AI Compositing with NVIDIA BROADCAST inside Unreal Engine

What is Nvidia Broadcast?

Nvidia Broadcast is one of the few applications that provides an AI-driven compositing solution. It allows users to apply effects to an incoming camera feed; in this demonstration we used the “Replace Background” feature, which gave us a composite without the need for any green screen. Nvidia achieves this by harnessing machine-learning image analysis together with depth perception.

Once we have our composited green background in Nvidia Broadcast, we move on to streaming the feed into Unreal Engine. From the real-time engine we can place the talent into any 3D environment, no longer limited to two dimensions.
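The keying step that removes Broadcast’s synthetic green background once the feed reaches the engine can be sketched in a few lines of NumPy. This is a simplified green-dominance key for illustration, not Unreal’s actual chroma keyer, and the `dominance` threshold is an arbitrary choice:

```python
import numpy as np

def green_mask(frame_rgb, dominance=40):
    """True where green dominates both red and blue by at least `dominance`."""
    r, g, b = (frame_rgb[..., i].astype(int) for i in range(3))
    return (g - np.maximum(r, b)) >= dominance

def composite(fg_rgb, bg_rgb, mask):
    """Replace masked (green) foreground pixels with the background plate."""
    return np.where(mask[..., None], bg_rgb, fg_rgb).astype(np.uint8)

# Usage: a pure-green 2x2 "talent" frame keys out completely.
fg = np.full((2, 2, 3), (0, 255, 0), dtype=np.uint8)   # green screen
bg = np.full((2, 2, 3), (10, 20, 30), dtype=np.uint8)  # background plate
out = composite(fg, bg, green_mask(fg))
```

In practice the engine runs an equivalent per-pixel operation on the GPU every frame, with spill suppression and edge softening layered on top.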

See below our tutorial and demonstration video:

Relevant links and software:

What’s Next?

Moving forward, this proof of concept has proven itself to work and now needs official testing back in our OSF lab on a GODBOX™ with a 4K camera feed input.


Unreal Engine Virtual Production Showcase


When the UE mothership sings the virtues of virtual production.