With the demand for LED studios blowing up, this week OSF Founder and VP Cinematographer Asa Bailey points us towards some useful alternatives to the ROE BP.2 LED panel.
5 more LED panels you should look at.
“All of these panels have the desired pixel pitch, brightness, blackness, and scan drive options that work for film and TV cameras, and they are also all good for hitting the REC2020 colour space,” says Bailey, DVP and CEO of On-Set Facilities. “All these panels are also what you can call open, as they support a wide choice of processing platforms, meaning you can choose your processing and take advantage of mixing multiple platforms between multiple productions, based on their availability in the market.”
Some of the panels below get significantly closer to Rec 2020. Where previously very few panels on the market had a colour gamut close to the Rec 2020 spec, there are an increasing number today, manufactured with LEDs that can touch the deeper colours, especially if driven by smart dynamic calibration, so that the colour is not restricted by the default factory calibration.
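As a back-of-the-envelope way to reason about those gamut claims: a panel's primaries define a triangle on the CIE 1931 xy chromaticity diagram, and a colour is reproducible (ignoring brightness) only if its chromaticity falls inside that triangle. Here is a minimal sketch — the Rec 2020 primary coordinates come from ITU-R BT.2020, while the test point is purely illustrative:

```python
# Point-in-triangle test in CIE 1931 xy chromaticity space.
# Rec.2020 primaries per ITU-R BT.2020, as (x, y) pairs for R, G, B.
REC2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

def _edge(p, a, b):
    # Signed area term: the sign says which side of edge a->b point p is on.
    return (p[0] - b[0]) * (a[1] - b[1]) - (a[0] - b[0]) * (p[1] - b[1])

def in_gamut(xy, primaries=REC2020):
    r, g, b = primaries
    s1, s2, s3 = _edge(xy, r, g), _edge(xy, g, b), _edge(xy, b, r)
    # Inside the triangle if the point is on the same side of all three edges.
    has_neg = s1 < 0 or s2 < 0 or s3 < 0
    has_pos = s1 > 0 or s2 > 0 or s3 > 0
    return not (has_neg and has_pos)

# The Rec.709 green primary sits comfortably inside the Rec.2020 triangle:
print(in_gamut((0.300, 0.600)))  # True
```

The same test with a panel's measured primaries swapped in for `REC2020` gives a rough idea of which colours the wall can actually hit.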
These five panels are good examples of LED panels for in-camera VFX, in no particular order:
On-Set Facilities CEO and VP Cinematographer Asa Bailey helms virtual productions both on-set and in the cloud, working with some of the industry's top VP crew and on-set VFX crew, and since the pandemic hit, he's never been busier. In this post we sit down with Bailey to try and learn more about the role of the Virtual Production Supervisor.
The Virtual Production Supervisor Role
The Virtual Production Supervisor advises the producers and implements agreed virtual production solutions for the benefit of the production, helping to minimize the production spend and schedule without compromising the vision or quality of the outcome. As head of the VP department, they have considerable influence over the whole production process, working in collaboration with the Director, DOP, Production Designer and VFX Supervisors.
The Virtual Production Supervisor is responsible for the reliability and functionality of the virtual production systems and workflows, and for the management of the virtual production both on-set and in the cloud. The grounding skills of the VP Supervisor are tuned towards computer networking, camera tracking systems, LED and real-time lighting, on-set VFX visualisation, motion capture, virtual cinematography, game engine operation, real-time rendering, broadcast engineering, cloud computing and software development.
NVIDIA Omniverse Multi-user
A Virtual Production Supervisor understands the creative possibilities and technical difficulties in the processes of virtual production and is expected to have an expert understanding of the best ways to use the technology practically and creatively during every stage of production.
A Virtual Production Supervisor understands the production conflicts between budget, creative direction & schedule and can implement virtual production solutions that help to mitigate these conflicts. It is the Virtual Production Supervisor who will represent virtual production in all departments, assisting others to embrace virtual production technologies, workflows and services into their way of working.
During development and preproduction, the Virtual Production Supervisor is expected to offer proposals and suggestions toward the overall creative direction of a project, including suggesting new virtual production methodologies and processes, and proposing technically minded ideas that may impart a creative benefit to the whole production.
During preproduction, the Virtual Production Supervisor is responsible for managing and supporting any multiuser connectivity to the collaborative cloud workflows including working with the VFX Supervisor to prepare assets for virtual scouting and providing technical support during virtual scouting sessions.
During production, the Virtual Production Supervisor is responsible for maintaining a stable real-time virtual production environment on-set and in the cloud. This includes the reliable recording of all digital VP data, such as Engine Takes, FBX, FIZ, motion capture, and pre-vis masks & composites, as well as ensuring the integrity of data transfer to the VFX team and on-set digital information technicians.
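To make "all digital VP data" a little more concrete, here is a minimal sketch of the kind of per-take record that responsibility implies. The field names are hypothetical — not any real production schema — but they mirror the data types listed above:

```python
from dataclasses import dataclass, field

# Illustrative only: field names are hypothetical, not a real VP schema.
@dataclass
class VPTake:
    slate: str                      # e.g. "104A_T3"
    engine_scene: str               # Unreal level / Engine Take reference
    fbx_path: str                   # camera-tracking data exported as FBX
    fiz_path: str                   # focus/iris/zoom lens metadata
    mocap_paths: list = field(default_factory=list)    # motion capture files
    previz_layers: list = field(default_factory=list)  # masks & composites
    transferred_to_vfx: bool = False  # integrity-checked handoff flag

take = VPTake(slate="104A_T3", engine_scene="NeonTunnel_v7",
              fbx_path="takes/104A_T3.fbx", fiz_path="takes/104A_T3_fiz.csv")
```

Tracking every take against a record like this is one way the handoff to VFX and the digital information technicians stays auditable.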
During post-production, the Virtual Production Supervisor is responsible for all recorded virtual production data and for coordinating and running any post-viz sessions using multi-user virtual environments, as required for post-production.
More Virtual Production Crew Roles
The Virtual Production Technical Director reports to the Virtual Production Supervisor or VFX supervisor and is responsible for the functionality of the virtual production systems and workflows, including the communication and management of the virtual production systems team.
During development and preproduction, the Virtual Production Technical Director is responsible for supervising multi-user virtual scouting sessions and for implementing and supporting the digital pre-visualization working environment.
The Virtual Production Technical Director is responsible for any multi-user connectivity to the collaborative cloud workflows, including working with the VFX and 3D content development teams to assist in preparing assets for virtual scouting. The VPTD is also responsible for ensuring that there is sufficient technical support to assist any crew members who may need it during on-set and cloud virtual production sessions.
During production, the Virtual Production Technical Director is responsible for maintaining a stable real-time virtual production environment on-set and in the cloud.
As well as assisting the Virtual Production Supervisor to invent and develop VP solutions, it is the responsibility of the Virtual Production Technical Director to provide a reliable VP system for the recording of all digital virtual production data, such as Engine Takes, FBX, FIZ, motion capture, and pre-vis masks & composites, as well as ensuring the integrity of data transfer off-set for any necessary post-production.
During post-production, the Virtual Production Technical Director is responsible for delivering all recorded virtual production data and for coordinating and running any post-viz sessions using multi-user virtual environments, as required by all departments during post-production.
The Virtual Production Developer is responsible for developing new virtual production software and hardware and its integration into virtual production systems.
The Virtual Production Technician is a specialist operator of virtual production hardware and software and its integration into virtual production systems.
The Virtual Production Camera Assistant is responsible for the virtual production hardware at the camera end of any virtual production system.
The System Networking Engineer is responsible for specifying and maintaining all network connectivity between virtual production hardware and software.
The Virtual Production Producer is a producer who understands the tech and what it takes, in time and money, to execute. They are responsible for budgeting VP systems for productions, studio installations, events, and wherever VP crew and technology come together on-set and in the cloud.
The Virtual Production Director is a Creative Director who specializes in directing both on-set and in the cloud. Hollywood directors who could be classed as Virtual Production Directors include James Cameron, Jon Favreau and Robert Zemeckis.
Virtual production is digital production 2.0
In the early 1990s, production computing happened, and it happened locally (digital cameras, desktop publishing, rendering). With virtual production, these local processes now connect to the cloud, bringing all the benefits of cloud computing to every production that's connected.
The point where local and cloud computing come together is called the "Edge", and where the Edge exists it is disrupting not only the content production industry but all industries.
At the Edge, where humans, computing and high-speed networks combine, virtual worlds (services and applications) can be built. These virtual worlds are often collectively referred to as the multiverse, metaverse or omniverse.
Virtual Production Technology can be quantified by its ability to aid and enable the action of virtual production. Virtual production technologies when combined create virtual production systems. These systems are designed to provide specific virtual production functions, workflows and services.
Asa Bailey CEO, On-Set Facilities
Virtual Production Director / Supervisor
I wanted to write a post about something very close to my heart. Writing. As we all ramp up our on-set tech, and some of you are now well and truly cracking open tins of UE4 whoop-ass, I want us to think about how virtual production will be adopted by writers and change the art of scriptwriting.
I have now clocked up hundreds of hours shooting on professional virtual production sets, and along the way I have written over 15 screenplays; I even had a fiction novel, Vampire of Highgate, published alongside Twilight and The Vampire Diaries. What I am trying to prove to you is: I may be known for being as dyslexic as a badger, as the techno DP, and for building crazy powerful VP systems, but I also really know my apples when it comes to writing. So will you please join me as I take a look at how the craft of screenwriting for virtual production may pan out.
How to write a script for virtual production
You can set your story anywhere in time or space.
The first thing is, with virtual production we can be virtually anywhere, both in time and in space, and this gives writers an insane amount of freedom. It's no more expensive to shoot in the future on Mars than it is to shoot a contemporary scene in a New York back alley. In fact, in virtual production terms, fantastic locations are easier to pull off in-engine in real-time.
Reality in virtual production is what costs. Just take a look at digital humans. Most digital doubles in virtual production are quite frankly comical; to do a digital human justice still takes a massive amount of polys and human skill in post. Reality in virtual production is expensive: you have to shoot lots of references using photogrammetry or even laser scanning, then you've got to model and texture and light and rig and, oh, it goes on.
Or you can download a Mars scene off the Unreal Marketplace, download a character into iClone, jump in your Rokoko mo-cap suit and get going in under, say, four hours. Fantasy scripts, funny ideas, amazing dialogue, and out-there locations all win in the virtual production landscape of today. In time, asset stores will become better stocked with more Quixel-grade assets, but as of 2020 the wacky and fantastic is cheaper to pull off; realistic is, shall we say, a little more challenging.
But if you've got the budget, hell, who am I to say what can and cannot be done; just look at the big-budget virtual productions from Disney – insanely good (The Jungle Book, The Mandalorian) but maybe not the best written? (Discuss.) Also, the really good thing about VP is, you won't even notice it 99.9% of the time. There are absolutely loads of TV shows, films and series that have at least some virtually produced scenes in them. All I am saying is, if you want to make something with your mates, low budget, think like a crazy person and you'll have a better chance of pulling something off.
Reallusion 3D Character Maker and Creator | iClone
Then let's look at cast. Sure, writing a sprawling crowd scene for virtual production is going to be a challenge to pull off, unless you have a 140-foot Mandalorian-style LED stage, and those cost about $12,000,000 to build.
But if you do have the multi-zillion budgets, you can pull a well-costumed crowd into your LED set and fill your cinematic boots. Otherwise (as part of the point of VP is that it is, at least at some level, mass democratized) you will not have the big-Z budgets and are going to be writing for your main actors surrounded by out-of-focus digitally animated extras and digital sets, if you're lucky projected on a 16m x 16m LED wall. Or maybe your script will be produced using green screen mixed reality, which I'll come on to.
If a film is all about place and action, writing for virtual production should be no different; we just need to think about some of the technology that may be used to make the film. Now I can hear a lot of great writers out there turning their noses up and saying "But Bailey, you should never put technology before story or drama, my dear", and in a way they are right. But what I am saying is, by understanding the technology and its limitations, writers can at least lean in to the positives of virtual production. Think about what the technologies can do.
Writing scripts for green screen mixed reality productions.
Writing for green screen productions is a tricky one if your producers want to make the film using real-time compositing. Fur costumes: not a great idea. Fast action: also tricky. I have shot over 400 hours on a mixed reality set, and in my opinion and in my experience "steady as she goes" is a good rule of thumb when writing for green screen mixed reality shooting. That is as far as the tech goes today. Maybe tomorrow it will be different and better, but live compositing does not like fast action, furry stuff or anything that will create blur. If you do shoot any of these on a green set, your real-time composites will be good for slap-comps in post. They will save your post team a lot of time, but you will not be getting final pixel recording.
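To see why blur and fur are the enemy, it helps to look at what a keyer actually does. This is only a toy difference-keyer sketch, nothing like a production keyer, but the principle holds: alpha is derived per pixel from how much greener the pixel is than its other channels, so a motion-blurred edge pixel that mixes subject and screen colour lands in the mushy middle of the matte.

```python
import numpy as np

def green_key_alpha(rgb, threshold=0.1, softness=0.2):
    """Toy difference keyer. rgb: float array (..., 3) in [0, 1].
    Returns alpha: 1.0 = keep (foreground), 0.0 = key out (screen)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    spill = g - np.maximum(r, b)          # how "green" each pixel is
    alpha = 1.0 - (spill - threshold) / softness
    return np.clip(alpha, 0.0, 1.0)

# Pure green screen keys out cleanly, a skin tone keys in cleanly...
clean_screen = green_key_alpha(np.array([0.0, 1.0, 0.0]))   # -> 0.0
clean_skin   = green_key_alpha(np.array([0.8, 0.6, 0.5]))   # -> 1.0
# ...but a blurred edge pixel (subject mixed with screen) is semi-transparent:
blurred_edge = green_key_alpha(np.array([0.4, 0.55, 0.35])) # -> ~0.75
```

That half-keyed edge is exactly what fast action and fur produce across whole regions of the frame, which is why the live comp ends up as a slap-comp rather than final pixel.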
Think of the best script for virtual production.
"If it was me writing a new script for virtual production studios, I'd think about great films that could have been made using real-time virtual production technology – Gravity!" says Asa Bailey, CEO of On-Set Facilities. It would make the perfect VP script: two actors, a tin can, and a set that's so fantastic it does not have to be judged for its realism. No crowd scenes, no chase scenes, not even that many wide shots with the actors in them. Yet it is an AMAZING film.
So on to my round up for now…
The new golden age of Whatever TF screen or device you're on
Writing scenes for virtual production – my advice, for now at least, would be to think in production terms as if you are writing for the old Hollywood studios of the '30s and '40s, when big studio sets were used with painted backdrops that could be anywhere. Imagine your characters: where are they, what might the staging be, what might a Director do with your scene, how can you help to write for the technologies that may be used? What if your actors are even in different locations, for we are in funny times where it's now possible for actors to be shot at home and projected into a scene remotely. Write tight, write compact, and then go big with fully digital VFX scenes that can now be rendered in real-time.
So this is a bit of an appeal.
Dear writers, please don’t freak out when you’re asked to write for a virtual studio production. Download yourself some great old movies and just watch how they did it back in the day. Then lose your mind and get fantastic. You never know, we may be entering a renaissance of truly great storytelling.
In this post, Director of Virtual Production Asa Bailey gives his views on various methods for creating realtime reflections on set, and explains why he favours those methods that will still give him options in post.
At this year's NAB there was a bit of a battle brewing, with some folks claiming that their new "final pixel" virtual screen walls would make green screen shooting, and in fact post-production, a thing of the past. I don't think so. Having used both on set in virtual productions, I'd say they are both valid options for shooting. It's about using the right tool for the shot.
In my view, the big problem with virtual wall technologies is that the footage is "baked" in camera and sold as "final pixel", just as the guys say in this video, and this leaves far fewer, if any, options for layered work (post VFX passes) in post.
But to their credit, virtual walls do give amazing reflections on faces, eyes and highly reflective surfaces (glass, paint, mirrors etc.), as shown in this demo from UE and Stargate Studios, who have done work on some big projects, so they do know a thing or two about VFX.
They sing the praises of "final pixel" in camera as king for both creative and financial reasons. But come on guys, you know as well as I do that in the real world, in the reality of working in a studio pipeline, with studio Producers, Agencies, Talent and god knows who else who wants to have an input into the final product, losing the option to post-process shots is too big a risk for Producers.
So would I shoot against a virtual wall? If the scene needed it and it would give the best result, yes. Would I use one to generate a realistic reflection on someone's face who was, say, looking out of a window? Yes, I totally would, and do. But on my sets they would be out of shot, so I could really use any sort of screen; it would not necessarily have to be connected to my virtual production rig and offer realtime perspective – a nice bonus, but not essential.
Matching the lighting on-set to the virtual set in Unreal Engine.
But as a Virtual Production Supervisor, or as a Director, either way it's my job to make sure that we leave the set with options, and that's why I shoot realtime in layers with a composite as an option. 80% of my shots are ready to grade and edit with minimum clean-up in post. The other 20% need critical changes, often driven by other stakeholders (Agencies, Producers etc.).
Shooting in layers protects the Producer's investment while cutting costs at the same time.
We shoot the talent layer (what the optical camera sees) and then we record the UE4-generated backgrounds as plates, along with other realtime VFX layers, on separate data layers, so that we can open them up later and add VFX passes as required – indeed, if required. We have the choice: we can go to grade and edit with the composite files, which are recorded in 10-bit 4:2:2 with audio, ready to drop onto any edit timeline, or we can open them up and add more VFX passes. We can even regenerate UE backgrounds using our tracking (FBX) data, so if we decide that we don't want that tree in the background anymore, that's OK – we simply remove it, run the BG layer again shooting with virtual cameras in UE, export the shot and drop it back into our layer stack.
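Under the hood, a layer stack like this comes back together with the standard "over" operation. A minimal numpy sketch, assuming straight (unpremultiplied) layers – the arrays here are tiny stand-ins for the recorded plates:

```python
import numpy as np

def over(fg, alpha, bg):
    # Standard "over" composite with a straight (unpremultiplied) foreground.
    # fg, bg: float images (..., 3); alpha: matte (..., 1) in [0, 1].
    return fg * alpha + bg * (1.0 - alpha)

# Talent layer (optical camera) over a regenerated UE background plate:
talent = np.zeros((4, 4, 3)); talent[..., 0] = 1.0   # red subject, toy data
matte  = np.zeros((4, 4, 1)); matte[1:3, 1:3] = 1.0  # subject matte
bg     = np.full((4, 4, 3), 0.2)                     # re-rendered BG plate
comp   = over(talent, matte, bg)
```

Because each layer is recorded separately, swapping in a new `bg` (say, the background re-rendered without the tree) and re-running `over` rebuilds the shot without ever touching the talent layer.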
A neon lit tunnel in Unreal Engine that the cast will appear to walk down.
Realtime reflections are a challenge; in fact, lighting in virtual production is a challenge full stop, and you have to have done it many times to know what works – as they say, practice makes perfect. My Gaffers and DOPs have been lighting green screen sets for years, and they know where to put the light and how to spill just the right amount of light on a subject (optical) to make it match in the final realtime composite. But the point is, if we need to, we can open up our recorded layers (optical, background, matte, foreground, composite) in post and fix any lighting issues. You may think this defeats the object of realtime production, but realistically realtime is a turbo boost to any production; it is not a golden-egg-laying goose. Nine times out of ten you'll still want post-production options, especially if you are shooting for high-end streaming shows, feature films or big-brand commercials. For me, I want to shoot with realtime virtual production methods, but I also want the option to take advantage of the 100-year-old VFX industry and all the amazing talent it holds.
Methods For Creating On-Set Lighting for Virtual Productions
Back to creating reflections in realtime: I prefer real lights rigged up on set to mimic the light that would be there in the virtual set. As you can see in the images below, we had an actor walk down a neon-lit corridor (the UE background); on-set we set up a number of lights to mimic the virtual set lights, so that our optical layer inherited the scene's lighting. Having some physical lighting on-set that matches, as best as possible, the light you'd get if you were on the virtual stage is important. In post it really helps if there is something close to work with.
On-set virtual production system and studio lighting.
If the reflections need to move like in a driving scene, I use 3 methods.
1 – Moving practical lights
We rig up stage lighting with gels and colour temperatures set to match the virtual world, fixed to some sort of mechanical rig that will move the lights. Or we set up moving flags in front of the lights to mimic passing buildings etc.
2 – Screens
We've used multiple large LCD walls to generate reflective images. Usually positioned out of shot (so we can still shoot for chroma), the screen casts a large layer of realtime light onto the scene; we'll run the background on the screen from within UE, or as a simple video file playing while we do the take.
3 – Projectors
We also use projectors projecting moving backgrounds; in fact this is my favourite way to cast realtime moving reflections on my cast. You can use scrims and diffusion to soften the projection – the light from the projection on its own is often very hard – but this gives you another creative tool to tweak and get just right for your look.
I hope that helps. As I say, practice makes perfect and there is no one right way; it's about the shot, the vision, the methods and obviously your budget. But any of the above approaches can be done on any budget, it's just a matter of scale and complexity.
Reported in the top trends at NAB this year, Unreal Engine made huge inroads into the production workflows of all the usual suspects in broadcast virtual studio and graphic production.
On looking at the exhibition halls, this trend looked more like a full on invasion of the engines, with every other booth demonstrating how they are integrating with a realtime virtual engine of some sort or another.
Let me tell you why I think this is the best thing for audiences since the invention of colour TV and will bring back a love of TV in our homes. But warning, I only have about 8% battery life on my laptop, so I’ll be quick.
Let me bring you home from the NAB show. To my home, to be precise. I recently bought myself a run-of-the-mill Toshiba 4K TV, at a cost of about £370. I know I'm a bit of a McScrooge laggard when it comes to my home tech. I have enough of the bleeding edge in my tech grotto where we build our solutions, so my home is a bit of a shrine to vintage tech and incense sticks.
Anyway, we put the new TV in place switched it on and WTF!
The standard-definition channels of British TV looked terrible, and the HD channels didn't look much better. Never mind the picture quality – make-up looked shocking in our usual soaps, and the stage sets, well, they looked like they had been built by prisoners on day-release. I'd never seen any of this with my rickety 10-year-old Sony HD TV. I never needed upscaling hacks or video processing, I just watched TV and vegged out!
Bad lighting, old fashioned sets, national TV looks out of date in 4K.
It seems only content shot in a cinematic, creative way and broadcast in HD had any chance on this new 4K so-called dream machine that was wrecking my home. Then I hopped over to watch a trendy Nordic crime show on the built-in streaming providers and things started to look a bit better; we tried wildlife and yes, that did look good. But for the main UK TV shows being broadcast on the standard channels, everything looked pretty crap.
It was not just about the picture quality; my point is this is a creative production problem too, as when we could view true 4K content (not much about yet), the sets, shots and make-up in normal run-of-the-mill TV productions looked bad. The only thing that looked good was high-end or blockbuster content.
And this is a bad thing for local TV and community diversity; people don't just want to watch Game of Thrones. On a Saturday I like to watch Casualty along with millions of other UK normalites, but Casualty needs to make some production changes – maybe a few virtual sets would help?
Virtual Sets in 4K look amazing on 4K TV
Then last week we (OSF) exhibited at our first academic roadshow, looking to hire a few new engineers for the upcoming product launch, and we took our new 4K TV with us to show off our work. OMG, what a difference: our virtually produced content looks AMAZING on this domestic 4K TV. Not only did the resolution look good, the higher-quality capture also seemed to work aesthetically with our sense of what 4K TV should look like. We'd watched our content on those small high-end production monitors in post and we knew it looked good, but we'd not run it on a common-or-garden 4K TV, the sort millions of folks now have at home.
100M 4K UHD Sets Sold Worldwide in 2018. New research from Futuresource Consulting projects more than 100 million 4K UHD televisions will be sold worldwide this year, returning consumer demand for TVs to positive growth.
Captured at 4K 10-bit 4:2:2 in camera RAW, putting our content on a normal, cheap domestic 4K TV proved to me that virtual production is the future of 4K and 8K TV content production; if nothing else, it matches my expectations of what high-resolution TV should look like. To sum up, virtual production leaves traditionally produced TV shows looking like they were shot on a phone, in your Nan's front room, by your Nan – unless your Nan happens to be Roger Deakins.
When it comes to making content for 4K and 8k TV, I’m going to say Unreal is better than real.
With the release of Love, Death + Robots, Netflix has brought the abilities and talents of 3D artists to the mass market, showcasing just how far animation has come in recent years.
The hit series showcased a variety of different styles of animation, from the neon photorealism of "Sonnie's Edge" to the hand-painted texture of "Suits". These different art styles were what made the series so beautiful and intriguing. The series almost felt like a showcase and "look book" for the studios which made each episode.
The production was led and powered by V-Ray for its ability to deliver fast, photorealistic lighting renders, which provided the photorealism wanted in "Sonnie's Edge", "Beyond the Aquila Rift" and "Helping Hand"; it also came into play in the more stylised animations, with its lighting renders still keeping to the aesthetic of each episode.
The Studios and Facilities that worked on Love Death + Robots
If only we'd had the chance to join this list of studios. We'd have put real humans in realtime rendered sets, like we did below for NTT. Combining realtime rendering with live action on-set and virtual cinematography, we'd have spun up our own twisted robot world.