In Unreal Engine 4.27 there’s a new LiveLink plugin, “LiveLinkXR”, which allows users to bring in live tracking data from trackers and HMDs. The XR plugin currently only supports SteamVR, but any VR device connected through it can be imported. The plugin is extremely straightforward to use: simply add an XR LiveLink source, select the desired devices (HMD, controllers and trackers) and watch for the green light confirming the engine is receiving live data.
Using the Nvidia Omniverse Audio2Face kit, users are able to generate real-time, AI-powered facial animation from a single audio source (see here for a full Audio2Face breakdown). This animation can then be applied to any humanoid 3D character through Audio2Face’s blend shape animation export. Below is a collection of tutorials demonstrating streamlined pipelines for applying Audio2Face animations to your own 3D characters in various 3D software packages.
Nvidia’s Audio2Face Omniverse kit harnesses deep learning to provide real-time facial animation from a single audio source. Audio2Face allows artists to simplify 3D character facial animation and instantly generate facial expressions and reactions from voice-overs. It also lets users retarget the captured animation to any human or human-esque 3D face, whether realistic or stylised.
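As a rough sketch of what a blend shape animation export boils down to, the snippet below reads per-frame weights for named facial shapes and looks them up for a single frame, the way a retargeting script in a DCC tool would before driving the character’s morph targets. The shape names and JSON layout here are illustrative assumptions, not Audio2Face’s actual export schema.

```python
import json

# Illustrative stand-in for a blend shape weight export: a list of shape
# names plus one row of weights per animation frame. The names and layout
# are assumptions for demonstration only.
export = json.loads("""
{
  "shapeNames": ["jawOpen", "mouthSmile", "browRaise"],
  "weights": [
    [0.10, 0.00, 0.05],
    [0.45, 0.10, 0.05],
    [0.80, 0.20, 0.00]
  ]
}
""")

def frame_weights(export, frame):
    """Map shape name -> weight for one frame of the export."""
    return dict(zip(export["shapeNames"], export["weights"][frame]))

# A retargeting step would feed these values into the rig's morph targets:
for f in range(len(export["weights"])):
    weights = frame_weights(export, f)
    print(f, weights["jawOpen"])
```

Because each frame is just a dictionary of shape weights, retargeting to a different character reduces to matching shape names between the export and the target rig.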
The table below relates a 4K LED processor canvas to the physical space and measurements of an LED volume. The real-world size is determined by both the pixel pitch and the resolution of each panel.
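The conversion behind the table is simple multiplication: physical span equals pixel count times pixel pitch. A minimal sketch, using an assumed pixel pitch rather than any particular panel’s specification:

```python
# Relating LED processor canvas pixels to physical size.
# The pixel pitch value below is an illustrative assumption,
# not the spec of any particular product.

def physical_size_m(pixels: int, pixel_pitch_mm: float) -> float:
    """Physical span (metres) covered by a run of pixels at a given pitch."""
    return pixels * pixel_pitch_mm / 1000.0

canvas_w, canvas_h = 3840, 2160   # 4K processor canvas (pixels)
pitch_mm = 2.84                   # pixel pitch in mm (assumed)

print(physical_size_m(canvas_w, pitch_mm))  # canvas width in metres
print(physical_size_m(canvas_h, pitch_mm))  # canvas height in metres
```

The same function works per panel: a 176 px panel edge at 2.84 mm pitch spans just under half a metre, which is how panel resolution and pitch together fix the real-space size of the volume.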
LED panel calibration is an essential element of any in-camera VFX production. Calibration across panels is designed to address three main areas: colour/brightness formatting, uniformity and seamlessness. Getting correct colours on the LED stage itself is one of the most important, yet most difficult, parts of calibration. Brompton Technology is combating this issue with the new Dynamic Calibration engine in its Tessera processors. Title image credit MGX Films.
Nanite is the new virtualized geometry system introduced in Unreal Engine 5. Nanite provides an internal format for imported meshes, creating “Nanite Meshes” in place of the previous Static Meshes. Through new rendering technology, Nanite is able to render pixel-scale detail and very high polygon counts effortlessly. It harnesses a new level-of-detail (LOD) rendering method that varies detail and polygon count for every unique perspective.
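The general idea behind view-dependent, pixel-scale detail can be sketched as choosing the coarsest detail level whose triangles still project to roughly one pixel on screen. The snippet below illustrates that principle only; it is not Nanite’s actual cluster-based algorithm, and the camera values are assumptions:

```python
import math

def projected_pixels(world_size_m, distance_m, fov_deg, screen_px):
    """Approximate on-screen size in pixels of an edge of the given
    world-space length, at a given distance from a perspective camera."""
    half_fov = math.radians(fov_deg) / 2
    view_width_at_d = 2 * distance_m * math.tan(half_fov)
    return world_size_m / view_width_at_d * screen_px

def pick_lod(tri_size_m, distance_m, fov_deg=90.0, screen_px=1920):
    """Step up one LOD (doubling triangle edge length) until triangles
    project to at least one pixel; returns the chosen LOD index."""
    px = projected_pixels(tri_size_m, distance_m, fov_deg, screen_px)
    lod = 0
    while px < 1.0:
        px *= 2   # next LOD doubles triangle edge length
        lod += 1
    return lod

print(pick_lod(0.001, 2.0))    # close-up: fine detail level
print(pick_lod(0.001, 200.0))  # far away: much coarser detail level
```

The point of doing this per view is that a distant or glancing perspective never pays for triangles smaller than a pixel, which is what lets very high source poly counts render cheaply.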
Lumen harnesses a fully dynamic indirect lighting pipeline that works in unison with an environment’s geometry, materials and light properties, providing an optimized artist workflow: lighting builds and reflection captures are instant. Lumen also removes the need for reflection cubemaps, since it fully replaces other reflection methods and is able to render geometrically precise reflections.
The Varjo XR-3 is an enterprise-level HMD (head-mounted display) system that provides high-end, high-fidelity AR and VR experiences. The XR-3 is one of the most advanced headsets on the market, with an extensive specification offering up to 2880 x 2720 pixels per eye and refresh rates of up to 90Hz.
Epic Games released Unreal Engine 4.27 last month, bringing buckets of new features and updates across all sectors. Today we’re looking over the In-Camera VFX pipeline, which has seen numerous changes in UE4.27 improving both usability and performance. The new version of In-Camera VFX removes the need to manually configure config files externally and now presents users with a new interface within the engine to configure layouts, pixel pitches and projection policies.
nDisplay Config Asset – This is the first of the new nDisplay-related assets you’ll use; it’s found under nDisplay in the content browser. The nDisplay Config Asset is where you configure your stage setup and what you’ll use to bring your stage into levels and scenes.
nDisplay 3D Config Editor – This is the replacement for the previous .cfg config file, which was created externally and provided the engine with important parameters such as projection policies and viewport and node configurations. The nDisplay 3D Config Editor now does all of the above and adds new features allowing for easier visualisation. The interface is broken into four specific tabs and a details panel.
- Components – Shows the imported static meshes, frustum camera and sync tick components added to the config uasset.
- Cluster – This tab visualises the cluster layout and hierarchy, showing specific viewports and their dedicated servers.
- Output Mapping – Output mapping provides users with a visualisation of the output canvas in relation to the input pixel resolutions, whilst also demonstrating which server renders each viewport.
- Configuration Viewport – This tab provides a simple visualisation of the stage itself and allows users to place the stage’s location accurately in relation to the real world.
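To give a feel for the kind of information the Cluster and Output Mapping tabs encode, the sketch below models viewports as rectangles on the output canvas, each assigned to a render node. The node and viewport names, and the resolutions, are made-up illustrations, not a real nDisplay configuration:

```python
# Illustrative model of an output mapping: each cluster node renders one
# or more viewports, each a rectangle on the shared output canvas.
# All names and sizes below are assumptions for demonstration.

canvas = {"width": 3840, "height": 2160}

viewports = {
    "vp_left":  {"node": "node_1", "x": 0,    "y": 0, "w": 1920, "h": 2160},
    "vp_right": {"node": "node_2", "x": 1920, "y": 0, "w": 1920, "h": 2160},
}

def node_pixel_load(viewports):
    """Total number of canvas pixels each render node is responsible for."""
    load = {}
    for vp in viewports.values():
        load[vp["node"]] = load.get(vp["node"], 0) + vp["w"] * vp["h"]
    return load

print(node_pixel_load(viewports))
```

Seeing the per-node pixel load at a glance is essentially what the Output Mapping visualisation gives you: a quick check that no single server is rendering a disproportionate share of the canvas.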
Switchboard – Switchboard now stands in for the previous Launcher and Listener apps whilst also adding a range of new user-friendly features, including the ability to connect, launch and monitor all running nodes.
What is Nvidia Broadcast?
Nvidia Broadcast is one of the few applications providing an AI-driven compositing solution. Nvidia Broadcast allows users to apply effects over an incoming camera feed; in this demonstration we used the “replace background” feature, allowing us to get a composite without the need for any greenscreen. Nvidia achieves this by harnessing both machine-learning image analysis and depth perception.
Once we have our composited green background in Nvidia Broadcast, we move on to streaming the feed into Unreal Engine. From the real-time engine we have the ability to place the talent into any 3D environment, and we are no longer limited to two dimensions.
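The keying step inside the engine essentially makes the strongly green pixels from the Broadcast feed transparent. A minimal per-pixel sketch of that idea follows; the threshold is an illustrative assumption, not how the engine or any plugin actually implements its keyer:

```python
# Minimal chroma-key sketch: pixels whose green channel strongly dominates
# the red and blue channels become fully transparent (alpha 0).
# The threshold value is an assumption for illustration.

def key_green(pixel, threshold=60):
    """Return an RGBA pixel: alpha 0 where green dominates, 255 elsewhere."""
    r, g, b = pixel
    is_green = g - max(r, b) > threshold
    return (r, g, b, 0 if is_green else 255)

frame = [(20, 220, 30), (180, 170, 160)]   # green backdrop vs. a skin tone
keyed = [key_green(p) for p in frame]
print(keyed)
```

In practice the engine does this on the GPU across the whole video texture, but the per-pixel decision is the same: compare the green channel against the others and write transparency where the backdrop wins.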
See below our tutorial and demonstration video:
Relevant links and software:
- Nvidia Broadcast
- Unreal Engine 4
- OBS Studio
- Off-World-Live / Spout2 (OBS plugin)
- Off-World-Live (Unreal Engine Plugin)
Moving forward, this proof of concept has proved itself to work and now needs to be officially tested back in our OSF lab on a GODBOX™ with a 4K camera feed input.