Calling all would-be virtual production pipeline developers out there: you don’t need a big studio, or even a big green screen, to work on your VP compositing pipelines. Remembering where we came from, five years ago we started OSF with essentially a PC, a cut-off roll of chroma green paper, and a lamp, and we set up our first VP pipeline in our front room. Today we develop pipelines for some of the world’s top directors, VFX studios and content studios. If we could do it, so can you. So what are you waiting for? Go grab something green, or blue.
GODBOX Computer Desktop Workstations
The thing is, with cloud computing your power is miles away in a data center, which isn’t great for latency, and GPU VMs (virtual machines) are very expensive to rent. When you work in VR/AR, and especially when you want to collaborate in real time, you need power close at hand.
GODBOX Pro goes into production, pairing AMD’s Threadripper Pro CPU and WRX80 chipset with NVIDIA’s professional GPUs and accelerated real-time visualization in an enterprise-level desktop workstation.
It’s been a journey that started in 2016, when On-Set Facilities (OSF) took their first PCs onto virtual production sets. What started out as a mega gaming rig, built by OSF to power Unreal Engine on-set, is now a complete edge computing solution, one that provides artists and developers with the power to develop full-spectrum real-time virtual production pipelines on the desktop, in the studio and on-set. From powering in-camera VFX on LED volumes, to designing products and building in real time in Omniverse, to creating 3D applications for AR/VR in Unity, the new GODBOX Pro is the world’s first real-time-engine-verified professional workstation.
Earlier this week Unity Software Inc, the real-time 3D game development platform, announced a $1.625 billion acquisition of Weta Digital, Peter Jackson’s New Zealand-based VFX and technology company. The deal promises to bring Weta Digital’s industry-famous tools to Unity creators globally. Tools developed by Weta Digital have been used on the likes of “Avatar,” “Game of Thrones,” and “The Lord of the Rings.”
Epic Games released Unreal Engine 4.27 last month, bringing buckets of new features and updates across all sectors. Today we’re looking over the In-Camera VFX pipeline, which has had numerous changes in UE4.27 improving both usability and performance. The new version of In-Camera VFX removes the need to manually edit external config files, and now presents users with a new interface within the engine to configure layouts, pixel pitches and projection policies.
nDisplay Config Asset – This is the first of the new nDisplay-related assets you’ll use; it’s found under nDisplay in the Content Browser. The nDisplay Config Asset is where you configure your stage setup, and it’s what you’ll use to bring your stage into levels and scenes.
nDisplay 3D Config Editor – This is the replacement for the previous .cfg config file, which was created externally and provided the engine with important parameters like projection policies, viewport and node configurations. The nDisplay 3D Config Editor now does all of the above and adds new features that make visualization easier. The interface is broken into four specific tabs and a details panel.
- Components – Shows the imported static meshes, the frustum camera, and the sync tick components added to the config asset.
- Cluster – Visualizes the cluster layout and hierarchy, showing each viewport and its dedicated server.
- Output Mapping – Provides a visualization of the output canvas in relation to the configured pixel resolutions, and shows which server renders each viewport.
- Configuration Viewport – Offers a simple visualization of the stage itself, allowing users to place the stage’s location accurately in relation to the real world.
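For context, here is a rough, illustrative sketch of the kind of external .cfg file that earlier nDisplay versions required, defining cluster nodes, windows, viewports, projection policies and screens by hand. All IDs, addresses and values below are hypothetical examples, not a production configuration:

```ini
[info] version="23"
[cluster_node] id="node_1" addr="192.168.0.101" window="wnd_1" master="true"
[window] id="wnd_1" viewports="vp_1" fullscreen="true" WinX="0" WinY="0" ResX="3840" ResY="2160"
[viewport] id="vp_1" x="0" y="0" width="3840" height="2160" projection="proj_1"
[projection] id="proj_1" type="simple" screen="scr_1"
[screen] id="scr_1" loc="X=1.5,Y=0,Z=0" rot="P=0,Y=0,R=0" size="X=3.0,Y=1.7" parent="screen_root"
[camera] id="cam_1" loc="X=0,Y=0,Z=1.7"
```

In UE4.27 all of this is defined visually inside the nDisplay 3D Config Editor instead of being hand-edited and kept in sync across machines.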
Switchboard – Switchboard now stands in for the previous launcher and listener apps, while also adding a range of new user-friendly features, including the ability to connect, launch and monitor all running nodes.
At the event horizon of production technology: the Director of Virtual Production and the Virtual Production Supervisor – overall technical direction, the VP HOD role, and virtual production systems from script to screen.
Virtual production systems are integrations of multiple hardware and software technologies that blur the lines between the real and virtual world, in-camera, in system and in the cloud. Just like technology in any other industry, production technology never stops advancing.
What’s it like to see yourself inside the metaverse? From day one we’ve had a fascination with exploring ways to transcend physical space and put people into digital realms.
Now, in our connected world we have The Convergence: the point in time at which multiple live and recorded data streams come together, transporting metadata, video, audio, depth and more. All this live data is now available in real time, open to developers and UX designers, and powered by a new generation of local low-latency computing. It travels over the network too: via LAN, at the edge via 5G, and in the cloud via super-fast broadband.
Dark Bay, a virtual production company, has released an eagerly awaited behind-the-scenes photo from the set of 1899, the hotly anticipated new Netflix mystery series from Jantje Friese and Baran bo Odar, creators of the global hit Dark.
Shot in Berlin, this is the first behind-the-scenes image to be officially released by the production team; it shows the LED stage and the set of the ship. One of Europe’s largest permanently installed LED studios for mixed-reality film productions has officially opened: the DARK BAY Virtual Production Stage at Studio Babelsberg near Berlin. With the support of the ARRI Solutions Group, ARRI was responsible for technical coordination of the installation and all components, including more than 70 ARRI SkyPanels. “For productions of the highest quality, state-of-the-art hardware was combined with engineering services to create an innovative overall system in close coordination with the client and business partners,” comments Markus Zeiler, Executive Board member of ARRI. The first production to be shot in the DARK BAY is Netflix’s “1899.” The new series by Jantje Friese and Baran bo Odar is captured with the ALEXA Mini LF and lenses specially created by ARRI Rental.
Image copyright: Alex Forge / Netflix.
When the UE mothership sings the virtues of virtual production.
Unity Technologies adds GODBOX computers to Unity Verified Solutions and pulls OSF, the Welsh computer company, under its big-tech wing.
OSF’s low-latency computing technology, including the GODBOX workstation, servers and real-time cloud production solutions, is set to serve as the Unity real-time computing platform. Real-time production is seeing a huge uptake in many industries as companies look to empower remote workforces and move to more efficient, greener, and better ways to work. Sales show real-time production is seeing multi-industry adoption, with media and entertainment, architecture, design, manufacturing and medical applications taking the lead.
More news to come from OSF and Unity.
An official press statement will be made in the coming weeks, but for those of you close to the OSF fire, you can take a look at the OSF solutions page at Unity.com, which went live today.
Our Unity day deserves a mark in our company history.
The OSF GODBOX low-latency real-time computing platform is now LIVE on the official Unity website. Not only that, the GODBOX is the first computer platform ever to be officially verified by Unity Technologies. As a Unity Verified Solutions Partner, OSF, which manufactures GODBOX computers in the UK and USA, joins the likes of Sony, Houdini and Varjo. You can see all the partners on the Unity Verified Solutions Partner page.