AI Invisibility in Unreal Engine

“AI-driven HLSL materials are coming to UE.” – Asa Bailey, Founder of On-set Facilities.



Sometimes you see one thing and it sends you on a logic bender. Today I spotted a comment by Epic Games on a post about an AI rotoscoping application / NUKE plugin. This clever bit of code is the work of AI VFX guru Sam Hodge.

What Sam has created is an AI program he calls Rotobot, which essentially lets you select a subject (or several) in your footage and then run the bot to automatically rotoscope it and create a mask. Wizard, right? I think so, but then Epic asked Sam if he could get Rotobot to run in HLSL. Sam said he didn’t think so, which sent me down my logic rabbit hole.

UPDATE [6/5/2019]: what if you cannot see what you want the machine to render, or it’s out of focus? Step in “Soft Rasterizer: A Differentiable Renderer for Image-based 3D Reasoning”. Basically, it sees pixels, makes computations based on learning from shapes in other images, and applies that learning to the render. You can see examples here and download the full Soft Rasterizer code from the project’s GitHub.
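For a flavour of what “soft” means there: instead of a hard inside/outside test per pixel, Soft Rasterizer turns triangle coverage into a smooth probability, which is what makes the render differentiable. Here’s a toy sketch of that idea in plain Python, using a sigmoid of the signed distance to each triangle edge (a simplification of the paper’s actual formulation; the triangle and the sigma value below are made up purely for illustration):

```python
import math

def sigmoid(x):
    # Numerically stable logistic function.
    if x >= 0:
        return 1.0 / (1.0 + math.exp(-x))
    z = math.exp(x)
    return z / (1.0 + z)

def edge_signed_distance(p, a, b):
    # Signed distance from point p to the line through edge a->b.
    # Positive on the left of a->b, i.e. inside a counter-clockwise triangle.
    (px, py), (ax, ay), (bx, by) = p, a, b
    ex, ey = bx - ax, by - ay
    cross = ex * (py - ay) - ey * (px - ax)
    return cross / math.hypot(ex, ey)

def soft_coverage(p, tri, sigma=0.01):
    # Product of sigmoids of the signed edge distances: ~1 deep inside,
    # ~0 far outside, and a smooth (differentiable) ramp at the boundary.
    cov = 1.0
    for i in range(3):
        a, b = tri[i], tri[(i + 1) % 3]
        cov *= sigmoid(edge_signed_distance(p, a, b) / sigma)
    return cov

tri = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]  # counter-clockwise winding
inside = soft_coverage((0.2, 0.2), tri)     # close to 1
outside = soft_coverage((2.0, 2.0), tri)    # close to 0
```

A hard rasterizer would return exactly 0 or 1 here; the smooth ramp is what lets gradients flow from pixels back to geometry.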

AI Rotoscoping in NUKE

Cut it out. NUKE-powered VFX for Sony’s Alice in Wonderland.

First, let me say this is all based purely on my morning-coffee logic. I woke up today with realtime pipelines rattling around in my head. Now, I assume Rotobot is written in C++ (??). If so, then theoretically you could pass a float from the C++ code to HLSL, using the AI logic running on the CPU to drive a custom HLSL shader in UE, via the custom node editor, on the GPU, which I believe is what Epic was thinking in the original comment.
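For what it’s worth, the plumbing for plain values already exists in UE: C++ can push a float into a material every tick via UMaterialInstanceDynamic::SetScalarParameterValue, and a Custom node can then read that parameter from HLSL. Here’s a toy Python stand-in for that data flow (every class and function name below is hypothetical; it only mimics the CPU-inference-to-shader-parameter handoff, not any real UE API):

```python
def ai_confidence(frame_index):
    # Stand-in for CPU-side AI inference: a 0..1 matte confidence
    # per frame (here just a toy ramp instead of a real model).
    return min(1.0, frame_index / 10.0)

class Material:
    # Stand-in for a dynamic material instance holding shader parameters.
    def __init__(self):
        self.params = {}

    def set_scalar_parameter(self, name, value):
        # Rough analogue of SetScalarParameterValue in UE.
        self.params[name] = value

def shader_opacity(material, base_opacity=1.0):
    # Stand-in for the HLSL custom node: the "GPU" side only sees the
    # uploaded scalar, never the AI code that produced it.
    return base_opacity * material.params.get("MatteConfidence", 0.0)

mat = Material()
for frame in range(12):  # per-tick game loop on the "CPU"
    mat.set_scalar_parameter("MatteConfidence", ai_confidence(frame))

opacity = shader_opacity(mat)  # driven entirely by the CPU-side "AI"
```

The point of the split is exactly the one in the paragraph above: the expensive inference stays on the CPU, and only a cheap scalar crosses over to the shader each frame.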

Automatic Green Screen Rotoscoping and Invisible Shaders

What this would mean is that, in principle, someone could write a custom shader node for Unreal Engine with AI-driven properties. Call me Alice, but this opens up a whole new level of thinking: AI-controlled Unreal materials. And not just Unreal, but any application that uses HLSL shaders, which is a lot of them.

In the demo videos you can see how this AI chroma / rotoscope application can cut out the daily drudgery of the rotoscoping task, not to mention the costs (as thousands of rotoscope artists start crying into their milk). It really is a leap in AI-automated FX, but it’s not realtime and it’s not in Unreal. Let’s try and fix that.
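To make the drudgery concrete: at its crudest, a chroma mask is just per-pixel alpha derived from colour distance, exactly the kind of arithmetic an HLSL shader evaluates millions of times per frame. A toy green-screen keyer in plain Python (the key colour, tolerance and softness values are illustrative; Rotobot’s ML-based approach is far more capable than naive colour distance):

```python
def chroma_key_alpha(pixel, key=(0, 255, 0), tolerance=120.0, softness=60.0):
    # Euclidean distance in RGB from the key colour: pixels near the key
    # become transparent (alpha 0), distant pixels stay opaque (alpha 1),
    # with a soft linear ramp in between to avoid hard fringing.
    dist = sum((c - k) ** 2 for c, k in zip(pixel, key)) ** 0.5
    if dist <= tolerance:
        return 0.0
    if dist >= tolerance + softness:
        return 1.0
    return (dist - tolerance) / softness

# A tiny 2x2 "frame": pure green, greenish spill, red, white.
frame = [[(0, 255, 0), (40, 230, 35)],
         [(255, 0, 0), (255, 255, 255)]]
mask = [[chroma_key_alpha(px) for px in row] for row in frame]
```

The limitation is obvious: this only keys a colour, while Rotobot keys a *subject*, which is precisely why it needs the learning-based approach rather than a formula like this.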

Mixed Reality in Unreal Engine

Do it now: here’s the tutorial on creating mixed reality in Unreal Engine.

Realtime Chromakey / Rotoscoping in Unreal Engine (UE) will happen.

The link will be made. Some of you clever clogs out there, hopefully sparked off by my mad ramblings, will surely make this happen. When the link between AI and HLSL shaders is made, porting floats to HLSL custom nodes in realtime, and when these computations take advantage of multiple CPU and GPU configurations, we’ll have realtime AI-driven rotoscoping, and therefore the possibility of creating invisible textures.

AI VFX will redress the CPU / GPU balance.

The decline in the CPU’s popularity may have run its course: AI loves the CPU, while graphics love the GPU, so maybe this is where the balance gets redressed. Perhaps the perfect AI VFX workstation would run its AI logic on multiple CPU processors while buffering data to HLSL shader logic that crunches away on the GPUs, all in what we could class as realtime. Realtime mixed-reality gaming, VFX for film and TV, augmented reality: imagine what tricks we could cook up with realtime AI-driven transparent materials.

Invisibility, anyone?


