Unreal Engine 5 Trailer

Today Epic Games announced the new Unreal Engine 5, coming to us in 2021. Epic released a mind-blowing trailer featuring several never-before-seen features set to totally change content creation in games, TV and film. (And basically we're getting a PS5.)

Lumen

Lumen is the name of the new global illumination system in Unreal Engine 5. Lumen delivers fully dynamic, real-time lighting to the highest standard without any baking, and in the demo it was paired with streaming 8K virtual textures.

Nanite

Epic also announced the new virtualised geometry system called Nanite, which allows artists to import assets made of millions of polygons without a decrease in performance. This removes the need for LODs or any other kind of polygon-budget optimisation. The demo showcased by Epic was allegedly made up of hundreds of billions of polygons, and it all ran on a PlayStation 5 in real-time.

The Chaos destruction system can now be used on dynamic meshes like cloth to give more realistic effects, as well as on normal geometry destruction. Epic also improved its animation workflows, adding new developments like predictive foot placement and motion warping to allow for smoother, more natural animations.

Changing ARRI Camera Settings in Real-Time with UE4

ARRI CAP, the back door to ARRI camera settings, allows VP developers to control on-set cameras from the virtual production system inside Unreal Engine.

When I spotted this post about changing ARRI LUTs live in-camera during recording and streaming, I started thinking about how ARRI CAP (Camera Access Protocol) could be your gateway to controlling an ARRI ALEXA from any device.

Drilling a little deeper into ARRI CAP to see what else you can control remotely through the camera's API, the answer seems to be pretty much anything you can do at the on-camera menu, using the long list of available get and set commands.
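To make that concrete, here's a rough sketch of what scripting those get and set commands could look like. To be clear, the port, command names and wire format below are illustrative placeholders rather than the real CAP protocol; the CAP feature-list PDF linked under External Links is the authoritative reference.

```python
# Hypothetical sketch only: the port, command names and wire format below are
# illustrative placeholders, NOT the real ARRI CAP protocol (see the CAP
# feature-list PDF under External Links for what the camera actually accepts).
import socket

CAMERA_IP = "192.168.1.50"   # address of the camera on the set network
CAP_PORT = 6500              # placeholder port number

def send_command(cmd: str) -> str:
    """Open a TCP connection, send one text command, return the camera's reply."""
    with socket.create_connection((CAMERA_IP, CAP_PORT), timeout=2.0) as s:
        s.sendall(cmd.encode("ascii") + b"\n")
        return s.recv(4096).decode("ascii").strip()

# Mirror CAP's paired get/set style: query a setting, then change it.
print(send_command("GET RecordingFps"))         # e.g. "25"
print(send_command("SET RecordingFps 50"))      # ramp to 50 fps
print(send_command("SET LookFile ShowLUT_01"))  # swap the in-camera look (LUT)
```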

Mo-Sys StarTracker absolute camera tracking solution.

Integrating ARRI CAP API for Virtual Production Systems

In a live composited virtual production you shoot an optical layer of the actors' on-set performance in-camera, while a virtual replica of the on-set camera, called the virtual camera, lives in your engine. The movement of the on-set camera is tracked using camera tracking technology, and the camera operator's performance is captured and streamed to the virtual camera in the engine. The v-cam shoots the CG elements of the scene while the on-set optical camera shoots the real actors, and the two are composited in real-time on the VP system.
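To picture what actually flows between the tracker and the engine, here's a minimal sketch of the per-frame handoff: a pose sample (with lens data) arrives from the tracking system and is applied to the virtual camera. The field layout and the apply_pose hook are our own illustrative assumptions, not any particular tracker's format.

```python
# Illustrative sketch: the pose fields and the apply_pose hook are assumptions,
# not the wire format of any particular tracking system.
from dataclasses import dataclass

@dataclass
class CameraPose:
    x: float                 # position in metres
    y: float
    z: float
    pan: float               # rotation in degrees
    tilt: float
    roll: float
    focal_length_mm: float   # lens data travels with every pose sample

def apply_pose(pose: CameraPose) -> None:
    """Stand-in for the engine call that moves the virtual camera each frame."""
    print(f"v-cam -> pos=({pose.x:.2f}, {pose.y:.2f}, {pose.z:.2f}) "
          f"pan={pose.pan:.1f} tilt={pose.tilt:.1f} focal={pose.focal_length_mm}mm")

# One sample per frame, streamed from the tracking hardware:
apply_pose(CameraPose(x=1.20, y=0.00, z=1.60,
                      pan=45.0, tilt=-5.0, roll=0.0,
                      focal_length_mm=35.0))
```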

One of the challenges of virtual production is calibrating the settings on both the optical cameras and the virtual in-engine cameras, especially between takes and set-ups; until we found this, it was impossible during takes. With ARRI CAP it looks like we can control not only the on-set camera's LUT but also its exposure, FPS, ISO and a raft of other settings, all through an interface accessed in a web browser or an in-engine UI plug-in. This will enable our virtual production crew to quickly calibrate physical and virtual cameras and change any number of camera settings between and during takes.
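Sketching how that calibration could be wired up: a small loop polls the camera through CAP and pushes only changed values to the virtual camera. The setting names and the engine hook are assumptions for illustration, and send_command stands in for the placeholder CAP helper sketched earlier.

```python
# Hedged sketch: setting names and the engine hook are illustrative, and
# send_command stands in for the placeholder CAP helper sketched earlier.
import time

SETTINGS = ["ExposureIndex", "RecordingFps", "ShutterAngle"]  # names illustrative

def send_command(cmd: str) -> str:
    """Stand-in for the CAP helper sketched earlier."""
    return "25"  # dummy reply so the loop runs without a camera attached

def apply_to_virtual_camera(name: str, value: str) -> None:
    """Stand-in for your VP system's hook (remote-control endpoint, OSC, ...)."""
    print(f"virtual camera <- {name} = {value}")

def sync_loop(poll_hz: float = 2.0, cycles: int = 3) -> None:
    """Poll the physical camera and push only the settings that changed."""
    cached: dict[str, str] = {}
    for _ in range(cycles):                 # use `while True` on a live set
        for name in SETTINGS:
            value = send_command(f"GET {name}")
            if cached.get(name) != value:   # only push changes to the engine
                apply_to_virtual_camera(name, value)
                cached[name] = value
        time.sleep(1.0 / poll_hz)

sync_loop()
```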

External Links

ARRI changing LUT files during live streaming

https://ymcinema.com/2019/12/05/arri-revolutionizes-live-streaming-real-time-luts-mixing-to-elevate-cinematic-look/

ARRI CAP Settings

https://www.arri.com/resource/blob/32390/9b6f862d3a0c8893b6577530e74be324/18-06-18-cap-2-5-camera-access-protocol-feature-list-alexa-mini-amira-sup-5-3-data.pdf

Unreal Engine Helps Asa Bailey Deliver Real-Time Virtual Production

Asa Bailey took the role of Director and DVP (Director of Virtual Production) on the short film series created for NTT Data. Each short film took just four days to produce; Asa achieved this pace with a newly pioneered Unreal Engine based real-time workflow. In this article we'll walk you through this real-time-ready workflow and how virtual production took the production to new heights.

The series was created for NTT Data, one of the largest IT services providers in the world, with more than 120,000 employees globally. It was made to show the future direction of NTT Data within each of its industry sectors, spanning automotive to healthcare. Each film follows an individual sent to the future to be awakened to the future of their company. The client wanted a very sci-fi aesthetic, and we achieved this through fully virtual sets throughout the production.

Asa Bailey, Director and Director of Virtual Production, on set.

Pre-Production

Pre-production on a real-time project like this was crucial: because virtual production methods make such huge cuts in post-production, we have to compensate in pre-production. There was a continuous back and forth between Asa and the 3D team, deciding animations, sets and colour schemes. The short films harnessed both virtual and physical objects, meaning there was still a strong need for the art department. In our workflow, anything touched or held by the talent stays a physical object, creating an augmented production between the real and virtual worlds.

On-set Production

On set is where Unreal Engine really came into its own, supporting the production with real-time composites and powering our virtual production system, which combined several pieces of software and hardware including Unreal Engine, Brainstorm's InfinitySet and chroma key hardware.

The virtual production system allowed us to previsualise the entire production in real-time, showing the talent composited within the virtual sets. This was displayed all over the studio, letting the crew and actors see an idea of the finished product. The system also records four separate layers: a composite of just the talent on black, the optical layer, the chroma key, and a final composite of talent and virtual set. This gives the production total freedom: you can go down the traditional VFX route with just the optical layer and a ready-made chroma key, or do what we did and take 90% of shots straight out of the system into editing, with only small touch-ups in post.
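To show why recording those layers separately keeps every option open, here's a minimal compositing sketch (with toy arrays standing in for recorded frames): given the optical layer and the chroma-key matte, you can rebuild the final composite, or the talent-on-black layer, at any point after the shoot.

```python
# Minimal compositing sketch: toy arrays stand in for the recorded layers.
# Frames are float32 RGB in [0, 1]; the matte is 1.0 where the talent is.
import numpy as np

def composite(fg: np.ndarray, matte: np.ndarray, bg: np.ndarray) -> np.ndarray:
    """Standard 'over' operation: foreground where matte=1, background elsewhere."""
    alpha = matte[..., np.newaxis]            # H x W -> H x W x 1 for broadcasting
    return fg * alpha + bg * (1.0 - alpha)

h, w = 4, 4
optical = np.random.rand(h, w, 3).astype(np.float32)      # real camera layer
virtual_set = np.random.rand(h, w, 3).astype(np.float32)  # engine render layer
matte = (np.random.rand(h, w) > 0.5).astype(np.float32)   # chroma-key layer

final = composite(optical, matte, virtual_set)            # talent over virtual set
talent_on_black = composite(optical, matte, np.zeros_like(virtual_set))
```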

Another element of real-time on-set production is the actual camera and optical layers. It's essential that the virtual and real-world lighting match: if there's a flashing lamppost in the background of the virtual set, that flicker needs to be reflected on the real talent.

Next Steps for Virtual Production

New virtual production methods and technologies are constantly surfacing, and with the rapid growth of Epic Games' Unreal Engine, virtual production keeps expanding and providing new opportunities.

The integration of real and digital characters is a massive leap for virtual production. We are now able to have virtual characters and physical actors in the same environment at the same time; whether that environment is a virtual or a physical set, the digital character can be inserted into either. We can now do this through new real-time facial and motion capture.

As seen in the video below, Mo-Sys Engineering are bringing this vision into the present through their Unreal Engine plugin and 3D tracking. Here you can see how Mo-Sys bring together tracking, Unreal Engine and motion capture to present an almost reversed virtual production: the only virtual asset is the character, while everything else, from the studio to the presenter, remains real. The performance was all done in real-time.

Real-Time Lighting and Reflections

Lighting within virtual production can be a massive challenge, but we now have the ability to adjust virtual and physical lights in real-time to get the perfect lighting on both the real and virtual sets. This allows total lighting symmetry with the virtual world, creating a more believable finished product.
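As a sketch of what driving physical lights from the virtual world can look like, the snippet below mirrors a virtual light's intensity onto a DMX dimmer channel over Art-Net (DMX over UDP). The node address, universe and channel mapping, and the engine callback are all assumptions for illustration; only the ArtDMX packet layout follows the public spec.

```python
# Hedged sketch: the node IP, universe and channel mapping are assumptions;
# the packet layout follows the public Art-Net ArtDMX format.
import socket
import struct

ARTNET_NODE = ("192.168.1.99", 6454)   # hypothetical Art-Net node on the rig

def artnet_dmx_packet(universe: int, channels: bytes) -> bytes:
    """Build a minimal ArtDMX packet carrying one universe of channel data."""
    if len(channels) % 2:                       # spec wants an even data length
        channels += b"\x00"
    packet = b"Art-Net\x00"                     # protocol ID
    packet += struct.pack("<H", 0x5000)         # OpCode: ArtDMX (little-endian)
    packet += struct.pack(">H", 14)             # protocol version (big-endian)
    packet += bytes([0, 0])                     # sequence, physical port
    packet += struct.pack("<H", universe)       # SubUni + Net
    packet += struct.pack(">H", len(channels))  # data length (big-endian)
    return packet + channels

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def on_virtual_light_changed(intensity: float) -> None:
    """Engine-side callback (assumed): drive the physical dimmer to match."""
    level = max(0, min(255, round(intensity * 255)))
    sock.sendto(artnet_dmx_packet(universe=0, channels=bytes([level])),
                ARTNET_NODE)

on_virtual_light_changed(0.8)   # e.g. the virtual lamppost flares to 80%
```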

The development of LED walls is moving rapidly, with pixel pitches dropping from 2.5mm to 0.7mm in less than five years. With a 0.7mm pitch it's possible to shoot the screens as backgrounds and get a totally real-time, in-camera VFX option. On a normal green-screen shoot, green spill is an issue for all CG artists, but when using LED walls the spill from the screens is actually the perfect lighting to match the virtual scene.
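Some back-of-envelope arithmetic shows why that pitch drop matters when you want to shoot the wall itself as the background (the 5m wall width here is just an example):

```python
# Back-of-envelope arithmetic: how many pixels an LED wall puts across frame
# at a given pitch, and how much wall you need to fill a UHD (3840 px) frame.
def pixels_across(wall_width_m: float, pitch_mm: float) -> int:
    """Horizontal pixel count of a wall at the given pixel pitch."""
    return int(wall_width_m * 1000 / pitch_mm)

for pitch_mm in (2.5, 0.7):
    width_for_uhd_m = 3840 * pitch_mm / 1000   # metres of wall for 3840 pixels
    print(f"{pitch_mm} mm pitch: {pixels_across(5.0, pitch_mm)} px across a "
          f"5 m wall; {width_for_uhd_m:.2f} m of wall fills a UHD frame")
```

At 2.5mm a 5m wall only carries 2000 pixels, well short of a UHD frame; at 0.7mm the same wall carries over 7000 pixels, and under 2.7m of wall already fills a UHD frame.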

On-set Facilities at IBC 2019

Next week we'll be at the IBC (International Broadcasting Convention) show in Amsterdam, from the 13th to the 18th. During the show we'll be exhibiting with our partners Mo-Sys Engineering on stand 8.F21 in the Future Zone, and we'll also have our own stand in Hall 6 at 6.C12.

On our stand in the Future Zone we'll be demonstrating a range of virtual production technologies, from HyperDrive cameras to LED walls. We'll be showcasing a range of products from Mo-Sys Engineering, including virtual production tools, StarTrackers and motion capture solutions. Our stands will also offer the opportunity to get a real hands-on feel for these new and developing technologies through our live demos happening throughout the event.

There'll be a real-time LED wall demo directed by Asa Bailey, an industry-leading Director of Virtual Production. During this demo you'll be able to interact with the walls and see the in-camera VFX happening right before your eyes. LED walls are an interesting new advancement in virtual production because they capture real reflections in real-time, on faces, masks and backgrounds, right there and then. Typical LED wall set-ups can cost upwards of $100,000; we provide a more reachable solution.

We'll also be hosting a real-time motion capture performance throughout the event, featuring Unreal Engine, Xsens motion capture suits and Mo-Sys StarTrackers. This demo shows off real-time motion capture, but it also shows our augmented reality virtual production solution, letting you walk onto a green screen and appear in the virtual world in real-time.

Our new HyperDrive camera system will also be on show this year. The HyperDrive camera can be controlled remotely from anywhere with minuscule delay, at close to zero latency. This opens up shooting possibilities in sports, wildlife and automotive work, allowing operators to work remotely and save production costs.

If you're at IBC, drop by and have a chat. We'd love to hear what you're working on and see if we can help with our new technologies. Contact us now to organize your meeting.

OTOY and Epic Games Release OctaneRender for Unreal Engine

OTOY and Epic Games announced the release of OctaneRender for Unreal Engine at SIGGRAPH this week, along with exactly how the two major 3D pipelines will work together. This collaboration will give artists the tools to create even greater cinematic experiences, from real-time previz to archviz.

OctaneRender for Unreal will allow artists to import, export and mix ORBX interchange assets from the two dozen Octane DCC integrations. Octane's materials and node graphs will be fully accessible through Unreal's node editor. This enables cinematic-quality renders from within a real-time engine, letting artists work in real-time with no decrease in quality.

The new plugin will complement Unreal's ray-tracing features and provide an even glossier finish. Octane will also provide final-frame rendering natively within Unreal Engine for the first time.

Octane for Unreal Engine features:

  • OctaneRender fully integrated into Unreal Engine 4’s node editor with intuitive scene navigation and easily accessible render passes within UE4

  • Automatic, fast conversion of Unreal scenes and materials into Octane.

  • Full integration of OctaneRender 2019 features, including: Layered Materials, OSL Vector Displacement, OSL Volume Shaders, Universal Camera with OSL Distortion Maps, all-new Light and Geometric Primitives, and Fast Spectral Random Walk Subsurface Scattering

  • AI Light, AI Scene, AI Spectral and Volumetric Denoising, and Out-of-Core Geometry, UDIM Support and Light Linking for production-ready final rendering

  • Octane Vectron and Spectron – fully procedural node-based Volumetric Geometry and Lighting for infinite detail and granular lighting control

  • The OTOY ORBX format will support over 20 industry-leading 3D tools, including Cinema 4D, Autodesk Maya and 3ds Max. This allows artists to drag and drop scenes from these tools straight into Unreal Engine.

This support between Octane and Unreal Engine is a massive milestone for Unreal Engine. Full support for a leading render engine like OctaneRender is a clear sign of how the 3D industry is turning to totally real-time pipelines.