
Vive Tracking Plugin Unreal Engine 4.27

HTC Vive Tracker

Unreal Engine 4.27 introduces the new Live Link plugin, “LiveLinkXR”, which allows users to bring in live data from trackers and HMDs. The plugin currently only supports SteamVR, but any connected VR device can be imported. It is extremely straightforward to use: simply add an XR Live Link source, select the desired devices (HMD, controllers and trackers), and look for the green light confirming the engine is receiving live data.
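
For readers who want to consume that tracker data in C++ rather than through Blueprints, below is a minimal sketch of polling a Live Link transform subject from the Live Link client. The subject name “Tracker_1” and the free-function wrapper are illustrative assumptions; actual subject names come from whatever the LiveLinkXR source exposes in the Live Link window.

    // Minimal sketch: read the latest transform for a Live Link subject (e.g. a Vive tracker).
    // Assumes the LiveLink and LiveLinkXR plugins are enabled and a subject named "Tracker_1" exists.
    #include "Features/IModularFeatures.h"
    #include "ILiveLinkClient.h"
    #include "Roles/LiveLinkTransformRole.h"
    #include "Roles/LiveLinkTransformTypes.h"

    static bool GetTrackerTransform(FTransform& OutTransform)
    {
        IModularFeatures& Features = IModularFeatures::Get();
        if (!Features.IsModularFeatureAvailable(ILiveLinkClient::ModularFeatureName))
        {
            return false; // Live Link is not loaded
        }

        ILiveLinkClient& Client = Features.GetModularFeature<ILiveLinkClient>(ILiveLinkClient::ModularFeatureName);

        // Evaluate the subject against the transform role to get its most recent frame.
        FLiveLinkSubjectFrameData FrameData;
        if (!Client.EvaluateFrame_AnyThread(FName("Tracker_1"), ULiveLinkTransformRole::StaticClass(), FrameData))
        {
            return false; // subject not found or no data received yet
        }

        if (const FLiveLinkTransformFrameData* Transform = FrameData.FrameData.Cast<FLiveLinkTransformFrameData>())
        {
            OutTransform = Transform->Transform;
            return true;
        }
        return false;
    }

Calling this from an actor’s Tick and applying the result to a scene component would mirror the physical tracker in the level, which is essentially what the built-in Live Link component does for you.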


Applying Audio2Face to your 3D Characters

Audio2Face Metahuman

Using the Nvidia Omniverse Audio2Face kit, users can generate real-time AI-powered facial animation from a single audio source (see here for a full Audio2Face breakdown). That animation can then be applied to any humanoid 3D character through Audio2Face’s blend shape animation export. Below is a collection of tutorials demonstrating streamlined pipelines for applying Audio2Face animations to your own 3D characters in various 3D applications.
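
To give a rough idea of what applying an exported blend shape animation looks like in engine code, here is a hedged Unreal C++ sketch that drives morph targets on a skeletal mesh from per-frame blendshape weights. The FBlendShapeFrame struct and morph target names such as “jawOpen” are illustrative assumptions; the real curve names depend on your character’s rig and how the Audio2Face export was mapped to it.

    // Illustrative sketch: drive a character's morph targets from per-frame blendshape
    // weights (for example, weights exported from Audio2Face). Names and the data
    // layout are assumptions for this example.
    #include "Components/SkeletalMeshComponent.h"
    #include "Containers/Map.h"

    struct FBlendShapeFrame
    {
        // Morph target name -> weight (0..1) for a single frame of animation.
        TMap<FName, float> Weights;
    };

    void ApplyBlendShapeFrame(USkeletalMeshComponent* Mesh, const FBlendShapeFrame& Frame)
    {
        if (!Mesh)
        {
            return;
        }

        for (const TPair<FName, float>& Pair : Frame.Weights)
        {
            // The morph target must exist on the mesh with a matching name,
            // e.g. "jawOpen" or "browInnerUp", depending on the export mapping.
            Mesh->SetMorphTarget(Pair.Key, Pair.Value);
        }
    }

In practice you would usually import the exported animation directly into your DCC or the engine, but the underlying operation is the same: each blendshape curve keys one morph target per frame.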


AI Real-Time Facial Animation in Nvidia Omniverse

Nvidia Audio2Face

Nvidia’s Audio2Face Omniverse Kit harnesses deep learning to provide real-time facial animation from a single audio source. Audio2Face allows artists to simplify 3D character facial animation and instantly generate facial expressions and reactions from voice-overs. It also lets users retarget the captured animation to any human or human-esque 3D face, whether realistic or stylised.


LED Stage Calibration for In-camera VFX

Brompton Dynamic Calibration

LED panel calibration is an essential element of any in-camera VFX production. Calibration across panels is designed to address three main areas: colour and brightness, uniformity, and seamlessness. Getting colour correct on the LED stage itself is one of the most important, yet difficult, parts of calibration. Brompton Technology is tackling this issue with the new Dynamic Calibration engine in its Tessera processors. Title image credit MGX Films.


What is Nanite in Unreal Engine 5?

Nanite in Unreal Engine 5

Nanite is the new virtualized geometry system introduced in Unreal Engine 5. Nanite provides an internal format for imported meshes, creating “Nanite Meshes” in place of the previous “Static Meshes”. Through the use of new rendering technology, Nanite is able to render pixel-scale detail and very high polygon counts efficiently. Nanite also uses a new approach to Level of Detail (LOD) rendering, adjusting the detail and polycount presented from every unique perspective.
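
To make the per-view detail idea concrete, here is a purely conceptual C++ sketch that picks a level of detail from an object’s projected screen size. This is not Nanite’s actual algorithm, which works per cluster of triangles inside the renderer, just an illustration of “detail chosen from each unique perspective”.

    // Conceptual illustration only: choose a level of detail from how large an object
    // appears on screen from the current viewpoint. Nanite is far more granular
    // (per triangle cluster), but the driving idea is similar.
    #include <algorithm>
    #include <cmath>

    // Returns an LOD index: 0 = full detail, larger values = coarser detail.
    int SelectLod(float boundingRadiusWorld,   // object bounds radius in world units
                  float distanceToCamera,      // camera-to-object distance
                  float verticalFovRadians,    // camera vertical field of view
                  int   screenHeightPixels,
                  int   maxLod)
    {
        // Approximate projected size of the bounds in pixels.
        const float projectedPixels =
            (boundingRadiusWorld / std::max(distanceToCamera, 0.001f)) *
            (static_cast<float>(screenHeightPixels) / (2.0f * std::tan(verticalFovRadians * 0.5f)));

        // Drop one level of detail each time the projected size halves.
        int lod = 0;
        float size = projectedPixels;
        while (size < static_cast<float>(screenHeightPixels) && lod < maxLod)
        {
            size *= 2.0f;
            ++lod;
        }
        return lod;
    }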


What is Lumen in Unreal Engine 5?

Unreal Engine 5 Lumen

Lumen is the new global illumination and reflection system created within Unreal Engine 5 to support the new generation of gaming consoles.

Lumen harnesses a fully dynamic indirect lighting pipeline that works in unison with an environment’s geometry, materials and light properties to provide an optimized artist workflow. Because the indirect lighting is fully dynamic, lighting builds and reflection captures are effectively instant. Lumen also removes the need for reflection cubemaps, fully replacing those methods with geometrically precise reflections.


System Requirements for powering the Varjo XR-3 VR Headset

Varjo XR3 enterprise VR

The Varjo XR-3 is an enterprise-level HMD (head-mounted display) system that provides high-end, high-fidelity AR and VR experiences. The XR-3 is one of the most advanced headsets on the market, with an extensive specification allowing for up to 2880 x 2720 pixels per eye and refresh rates of up to 90Hz; at those maximums that is roughly 1.4 billion pixels per second across both eyes, which is why the system driving it matters.


In-Camera VFX in Unreal Engine 4.27 Tutorial

In-Camera VFX unreal engine 4.27

Epic Games released Unreal Engine 4.27 last month, bringing buckets of new features and updates across all sectors. Today we’re looking at the In-Camera VFX pipeline, which has had numerous changes in UE 4.27 that improve both usability and performance. The new version of In-Camera VFX removes the need to manually configure config files externally and now presents users with a new in-engine interface for configuring layouts, pixel pitches and projection policies.

What’s new?

nDisplay Config Asset – This is the first of the new nDisplay-related assets you’ll use; it’s found under nDisplay in the Content Browser. The nDisplay Config Asset is where you configure your stage setup and what you’ll use to bring your stage into levels and scenes.

nDisplay 3D Config Editor – This replaces the previous .cfg config file, which was created externally and provided the engine with important parameters such as projection policies, viewport and node configurations. The nDisplay 3D Config Editor now handles all of the above and adds new features that make the setup easier to visualise. The interface is broken into four tabs and a details panel:

  1. Components – Shows the imported static meshes, frustum camera and sync tick components added to the config uasset.
  2. Cluster – Visualises the cluster layout and hierarchy, showing each viewport and its dedicated server.
  3. Output Mapping – Provides a visualisation of the output canvas in relation to the input pixel resolutions, while also showing which server renders each viewport.
  4. Configuration Viewport – Gives a simple visualisation of the stage itself and allows users to place the stage accurately in relation to the real world.

Switchboard – Switchboard now stands in for the previous Launcher and Listener apps whilst also adding a range of new user-friendly features. These features include the ability to connect, launch and monitor all running nodes.

 


AI Compositing with NVIDIA BROADCAST inside Unreal Engine

What is Nvidia Broadcast?

Nvidia Broadcast is one of the few applications that provides an AI-driven compositing solution. It allows users to apply effects to an incoming camera feed; in this demonstration we used the “replace background” feature, giving us a composite without the need for any greenscreen. Nvidia achieves this by harnessing machine learning image analysis and depth perception.

Once we have our composited green background in Nvidia Broadcast, we move on to streaming the feed into Unreal Engine. From the real-time engine we can place the talent into any 3D environment and are no longer limited to two dimensions.
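
To show what the engine-side keying step amounts to, here is a small conceptual C++ sketch of a per-pixel green-screen key. It is not the material or Composure setup we actually use in Unreal (keying normally happens on the GPU), just an illustration of why a clean, solid green background from Nvidia Broadcast makes the talent easy to isolate.

    // Conceptual per-pixel chroma key: make "mostly green" pixels transparent.
    // In the engine this would run on the GPU in a material or Composure pass;
    // this CPU version only illustrates the idea.
    #include <algorithm>
    #include <cstdint>

    struct Rgba
    {
        uint8_t r, g, b, a;
    };

    // Keys out pixels whose green channel dominates red and blue by more than 'threshold'.
    void ChromaKeyGreen(Rgba* pixels, int count, int threshold = 40)
    {
        for (int i = 0; i < count; ++i)
        {
            Rgba& p = pixels[i];
            const int dominance = static_cast<int>(p.g) - std::max<int>(p.r, p.b);
            if (dominance > threshold)
            {
                p.a = 0; // fully transparent: treat as background
            }
        }
    }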

See our tutorial and demonstration video below:

Relevant links and software:

What’s Next?

Moving forward, this proof of concept has proven to work and now needs official testing back in our OSF lab on a GODBOX™ with a 4K camera feed input.