
In-Camera VFX in Unreal Engine 4.27 Tutorial


Epic Games released Unreal Engine 4.27 last month, bringing buckets of new features and updates across all sectors. Today we’re looking at the In-Camera VFX pipeline, which has seen numerous changes in UE 4.27 that improve both usability and performance. The new version of In-Camera VFX removes the need to manually edit config files outside the engine and instead presents users with a new in-engine interface for configuring layouts, pixel pitches and projection policies.

What’s new?

nDisplay Config Asset – This is the first of the new nDisplay-related assets you’ll use; it’s found under nDisplay in the Content Browser. The nDisplay Config Asset is where you configure your stage setup and what you’ll use to bring your stage into levels and scenes.

nDisplay 3D Config Editor – This replaces the previous .cfg config file, which had to be authored outside the engine and supplied important parameters such as projection policies, viewport and node configurations (a rough sketch of that legacy format follows the tab list below). The nDisplay 3D Config Editor now handles all of the above and adds new features that make visualization easier. The interface is broken into four specific tabs and a details panel:


  1. Components – Shows the imported static meshes, frustum camera and sync tick components added to the config uasset.
  2. Cluster – This tab visualises the cluster layout and hierarchy, showing each viewport and its dedicated server.
  3. Output Mapping – Provides a visualization of the output canvas in relation to the input pixel resolutions, whilst also showing which server renders each viewport.
  4. Configuration Viewport – This tab gives a simple visualisation of the stage itself and lets users place the stage accurately in relation to the real world.
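
For anyone who never touched the old workflow, here is a rough, abbreviated sketch of the kind of external .cfg file the nDisplay 3D Config Editor replaces – a cluster node, its window, a viewport and a simple projection policy tied to a physical screen. The key names are reconstructed from memory of the old 4.2x format rather than copied from documentation, so treat them as indicative only:

    [cluster_node] id=node_main   addr=192.168.0.101 window=wnd_main master=true
    [window]       id=wnd_main    viewports=vp_wall fullscreen=true
    [viewport]     id=vp_wall     x=0 y=0 width=1920 height=1080 projection=proj_wall
    [projection]   id=proj_wall   type=simple screen=scr_wall
    [screen]       id=scr_wall    loc="X=1.5,Y=0,Z=0" rot="P=0,Y=0,R=0" size="X=2.0,Y=1.125"
    [camera]       id=cam_default loc="X=0,Y=0,Z=1.7"

In 4.27 all of this now lives inside the nDisplay Config Asset and is edited visually through the tabs above.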

Switchboard – Switchboard now stands in for the previous Launcher and Listener apps whilst also adding a range of new user-friendly features, including the ability to connect to, launch and monitor all running nodes.

 


AI Compositing with NVIDIA BROADCAST inside Unreal Engine

What is Nvidia Broadcast?

NVIDIA Broadcast is one of the few applications that provides an AI-driven compositing solution. It lets users apply effects to an incoming camera feed; in this demonstration we used the “Replace Background” feature, which gave us a composite without the need for any green screen. NVIDIA achieves this by harnessing machine-learning image analysis together with depth perception.
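
To make that concrete, below is a minimal Python sketch of the same class of technique – per-frame person segmentation driving a background replacement. It uses MediaPipe’s selfie-segmentation model purely as a stand-in (it is not how NVIDIA Broadcast works internally), and the 0.5 matte threshold is an arbitrary choice:

    # Conceptual stand-in for AI background replacement (not NVIDIA's implementation).
    # Each webcam frame is segmented into person / not-person, and everything that
    # is not the person is filled with solid green, ready for a downstream keyer.
    import cv2
    import numpy as np
    import mediapipe as mp

    segmenter = mp.solutions.selfie_segmentation.SelfieSegmentation(model_selection=1)
    cap = cv2.VideoCapture(0)  # default webcam

    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        mask = segmenter.process(rgb).segmentation_mask    # per-pixel "person" confidence, 0..1
        person = np.stack((mask,) * 3, axis=-1) > 0.5      # boolean matte (threshold is arbitrary)
        green = np.full(frame.shape, (0, 255, 0), dtype=np.uint8)
        composite = np.where(person, frame, green)         # talent over a synthetic green screen
        cv2.imshow("ai keyed feed", composite)
        if cv2.waitKey(1) & 0xFF == 27:                    # Esc to quit
            break

    cap.release()
    cv2.destroyAllWindows()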

Once we have our composited green background from NVIDIA Broadcast, we move on to streaming the feed into Unreal Engine. Inside the real-time engine we can key that green background and place the talent into any 3D environment, no longer limited to a flat 2D composite.
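
Inside Unreal the keying itself is handled by the engine’s media and material tools on the live feed; the hypothetical Python snippet below just illustrates the equivalent offline step on a single saved frame, turning Broadcast’s green background into transparency so the talent can sit in front of a 3D render. File names and HSV thresholds are made up for the example:

    # Illustrative offline chroma key of a single frame captured from the Broadcast feed.
    # In the actual workflow this happens in-engine on the live video, not in Python.
    import cv2
    import numpy as np

    frame = cv2.imread("broadcast_frame.png")              # hypothetical saved frame
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

    # Select the solid green background produced by the "Replace Background" effect.
    lower_green = np.array([40, 80, 80], dtype=np.uint8)
    upper_green = np.array([85, 255, 255], dtype=np.uint8)
    background = cv2.inRange(hsv, lower_green, upper_green)  # 255 where the pixel is green

    alpha = cv2.bitwise_not(background)                    # talent opaque, background transparent
    rgba = cv2.cvtColor(frame, cv2.COLOR_BGR2BGRA)
    rgba[:, :, 3] = alpha
    cv2.imwrite("talent_keyed.png", rgba)                  # drop this over any 3D environment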

See our tutorial and demonstration video below:

Relevant links and software:

What’s Next?

Moving forward, this proof of concept has shown itself to work and now needs official testing back in our OSF lab on a GODBOX™ with a 4K camera feed input.


Disguise nabs Gideon Ferber from Ross Video


In a move that signals the company’s push into the broadcast space, Disguise have lured Gideon Ferber away from Ross Video. In his new role Gideon takes on the title of Product Director, Broadcast.

Who’s behind Disguise?

Not so long ago Epic Games bought a 5% stake in Disguise, and the company also bagged a juicy round of VC funding. That injection means the media server business must now strike out, grow sales and enter new markets. Disguise also attracted Phil Ventre from Ncam earlier in the year, succeeding in pulling together an impressive bench of broadcast industry insiders.

But broadcast is a funny business, full of political intrigue and mischief as vendors line up for permanent install gigs and shows. Having the right connections is key to getting in line early in the project life-cycle. Looking at the Disguise website, its historical broadcast examples show a bias towards live-event-style productions such as big music awards shows.

Looking at the broadcast VP market.

It will be interesting to see how, or if, Disguise will wrap Unreal Engine’s studio-based virtual production features into some sort of paid-for software offering for virtual studios, or whether they will stick to the big live-show formats. The virtual studio broadcast space is already very mature, with software and hardware from the likes of Brainstorm and Zero Density.

All these broadcast virtual production solutions, including Ross Video’s own Lucid Studio, offer virtual studio operators a single usable interface for controlling fully virtual studios. The tools include easy ways for operators to create real-time chroma masks, integrate live outside data sources, and control graphics and 3D AR elements.

Unreal Engine already has these broadcast virtual studio features available natively in-engine; these broadcast solutions simply wrap them up and make them more usable. All of the above virtual studio software, including Unreal Engine, runs on Windows PC architecture.