Nvidia Broadcast is one of the few applications that provides an AI-driven compositing solution to users. It allows users to apply effects to an incoming camera feed; in this demonstration we used the “replace background” feature, which gave us a composite without the need for any green screen. Nvidia achieves this by harnessing the power of both machine learning image analysis and depth perception.
Once we have our composited green background in Nvidia Broadcast, we move on to streaming the feed into Unreal Engine. From within the real-time engine we can place the talent into any 3D environment, no longer limited to two dimensions.
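The keying step in this workflow can be illustrated with a minimal sketch. The snippet below is a hypothetical, simplified chroma key in plain NumPy, not Nvidia's or Unreal's actual implementation (Nvidia Broadcast uses AI segmentation rather than color distance): it masks out pixels close to the key color and composites the talent over a new background.

```python
import numpy as np

def chroma_key(frame_bgr, key_bgr=(0, 255, 0), tol=60):
    """Return an 8-bit mask: 0 where a pixel is close to the key color, 255 elsewhere."""
    diff = frame_bgr.astype(np.int32) - np.array(key_bgr, dtype=np.int32)
    dist = np.sqrt((diff ** 2).sum(axis=-1))   # Euclidean distance to the key color
    return (dist > tol).astype(np.uint8) * 255

def composite(fg_bgr, bg_bgr, mask):
    """Place the keyed foreground over a new background using the mask as alpha."""
    alpha = (mask / 255.0)[..., None]
    return (fg_bgr * alpha + bg_bgr * (1 - alpha)).astype(np.uint8)
```

In production the engine's keyer does this per frame on the GPU, with soft edges and spill suppression; the sketch only shows the core idea of replacing the solid-color background with new pixels.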
At the event horizon of production technology sit the Director of Virtual Production and the Virtual Production Supervisors: the VP head-of-department roles providing overall technical direction for virtual production systems, from script to screen.
Virtual production systems are integrations of multiple hardware and software technologies that blur the lines between the real and virtual world, in-camera, in system and in the cloud. Just like technology in any other industry, production technology never stops advancing.
What is it like to see yourself inside the metaverse? From day one we’ve had a fascination with exploring ways to transcend physical space and put people into digital realms.
Now, in our connected world, we have The Convergence: the point in time at which multiple live and recorded data streams come together, transporting metadata, video, audio, depth, and more. All this live data is now available in real time, open to developers and UX designers, and powered by a new generation of local low-latency computing. It travels over the network too: via LAN, at the edge via 5G, and in the cloud via super-fast broadband.
Dark Bay, a virtual production company, has released eagerly awaited behind-the-scenes photos from the Studio Babelsberg set of the new Netflix show 1899, the hotly anticipated mystery series from Jantje Friese and Baran bo Odar, creators of the global hit Dark.
Shot in Berlin, this is the first behind-the-scenes image to be officially released by the production team; it shows the LED stage and the set of the ship. One of Europe’s largest permanently installed LED studios for mixed reality film productions has officially opened: the DARK BAY Virtual Production Stage at Studio Babelsberg near Berlin. With the support of the ARRI Solutions Group, ARRI was responsible for technical coordination of the installation and all components, including more than 70 ARRI SkyPanels. “For productions of the highest quality, state-of-the-art hardware was combined with engineering services to create an innovative overall system in close coordination with the client and business partners,” comments Markus Zeiler, Executive Board member of ARRI. The first production to be shot in the DARK BAY is Netflix’s “1899.” The new series by Jantje Friese and Baran bo Odar is captured with the ALEXA Mini LF and lenses specially created by ARRI Rental.
Today Ryan Bellgardt and Emily Taylor of Oklahoma-based Boiling Point Media visited the ARRI Creative Space in Burbank for a meeting with producer/VPS Darnell Williams, founder of On Set Facilities strategic partner Elektrashock/OSF.
Unity has today announced its acquisition of Interactive Data Visualisation, the company best known as the creator of SpeedTree. The current plan is for the SpeedTree technology to become more deeply integrated into the Unity ecosystem over time, starting with the engine’s upcoming 2021.2 release.
As explained by Unity’s R&D SVP Ralph Hauwert, “creating natural, organic-looking environments is currently a costly and labor-intensive process. If you’re pursuing a high level of detail and realism, the process of manually creating one tree – not to mention a forest – can take upwards of four months.”
Unity Technologies adds GODBOX computers to Unity Verified Solutions and pulls OSF, the Welsh computer company, under its big-tech wing.
OSF low-latency computing technology, including the GODBOX workstation, servers, and real-time cloud production solutions, is set to serve as the Unity real-time computing platform. Real-time production is seeing huge uptake across many industries as companies look to empower remote workforces and move to more efficient, greener, and better ways of working. Sales show real-time production being adopted across multiple industries, with media and entertainment, architecture, design, manufacturing, and medical applications taking the lead.
More news to come from OSF and Unity.
An official press statement will be made in the coming weeks, but for those of you close to the OSF fire, you can take a look at the OSF solutions page at Unity.com, which went live today.
Our Unity day deserves a mark in our company history.
The OSF GODBOX low-latency real-time computing platform is now LIVE on the official Unity website. Not only that, the GODBOX is the first computer platform ever to be officially awarded verification by Unity Technologies. As a Unity Verified Solutions Partner, OSF, which manufactures GODBOX computers in the UK and USA, joins the likes of SONY, HOUDINI, and Varjo. You can see all of them on the Unity Verified Solutions Partner page.
What’s all the fuss about? Well, if you work in real-time production you’ll know how annoying it is that most tools require you to jump in and out of play mode to get the best results. On set, this is a pain and a barrier to rapid development and to responding to the crew’s needs. So when we look at an engine, we want to be able to test what it renders like out of play mode as well as in play mode. It is for these reasons that we created GODBOX.
Asa Bailey, Director of Virtual Production, is known as one of the earliest pioneers of real-time virtual production. Over recent years he has become an influential arbiter of the virtual production user experience. Essentially, Bailey puts technology through its paces before it goes anywhere near a set.
In this video, Asa takes an off-the-menu virtual camera for a spin in Unity. Using the camera tool plugin to gain control of the Unity GAME camera (the good-looking viewport), Asa was able to select any viewport, hit a hotkey, take control of the Unity camera, and fly around by moving the camera with the WASD keys. He was easily able to use the usual in-editor camera controls to adjust the camera acceleration, then move to the volume panel for all the exposure and bloom settings.
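The fly-camera behavior described above (hold WASD keys to translate the camera along its heading) can be sketched engine-agnostically. The function below is a hypothetical illustration, not the camera tool plugin's actual code; it maps the held keys to a position update derived from the camera's yaw.

```python
import math

def wasd_move(pos, yaw_deg, keys, speed=1.0, dt=0.016):
    """Return a new camera position after applying the held WASD keys.

    pos: (x, y, z) camera position; yaw_deg: heading in degrees;
    keys: set of currently held keys, e.g. {'w', 'd'}.
    (Illustrative only: a simple ground-plane fly camera, yaw about the y axis.)
    """
    yaw = math.radians(yaw_deg)
    forward = (math.sin(yaw), 0.0, math.cos(yaw))   # camera forward on the ground plane
    right = (math.cos(yaw), 0.0, -math.sin(yaw))    # perpendicular to forward

    dx = dz = 0.0
    if 'w' in keys: dx += forward[0]; dz += forward[2]
    if 's' in keys: dx -= forward[0]; dz -= forward[2]
    if 'd' in keys: dx += right[0];  dz += right[2]
    if 'a' in keys: dx -= right[0];  dz -= right[2]

    step = speed * dt   # acceleration/speed is the knob the in-editor controls expose
    return (pos[0] + dx * step, pos[1], pos[2] + dz * step)
```

Called once per frame with the engine's delta time, this produces the familiar smooth fly-through; opposing keys cancel, and scaling `speed` mimics adjusting the camera acceleration in the editor.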
OSF Founder and Director of Virtual Production Asa Bailey discusses virtual production with Jamie Allan, NVIDIA’s Media, Entertainment & Broadcast Industry Lead, EMEA. Recorded live on behalf of the 2021 Media Production and Technology Show.