Metaprojection is a way to see yourself in the Metaverse

What’s it like to see yourself inside the metaverse? From day one we’ve had a fascination with exploring ways to transcend physical space and put people into digital realms.

Now, in our connected world, we have reached The Convergence: the point at which multiple live and recorded data streams come together, carrying metadata, video, audio, depth and more. All of this data is now available in real time, open to developers and UX designers, and powered by a new generation of local low-latency computing, plus the network too, via LAN, at the edge via 5G, and in the cloud via super-fast broadband.
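To give a flavour of what that convergence looks like from a developer’s seat, here’s a minimal Python sketch, with invented stream names and payloads, of merging several timestamped feeds, live or recorded, into one ordered stream:

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Sample:
    timestamp: float                       # capture time, in seconds
    stream: str = field(compare=False)     # e.g. "video", "audio", "depth"
    payload: bytes = field(compare=False)  # the raw frame or packet

def converge(*streams):
    """Interleave timestamp-ordered streams into one feed.

    heapq.merge is lazy, so this works for live generators
    just as well as for recorded files played back from disk.
    """
    yield from heapq.merge(*streams)

# Invented example data: three short, pre-sorted feeds.
video = [Sample(0.00, "video", b"frame0"), Sample(0.04, "video", b"frame1")]
audio = [Sample(0.00, "audio", b"pcm0"), Sample(0.02, "audio", b"pcm1")]
depth = [Sample(0.01, "depth", b"d0")]

for sample in converge(video, audio, depth):
    print(f"{sample.timestamp:.2f}s  {sample.stream}")
```

The point isn’t the few lines of code; it’s that all of these feeds now arrive with timestamps attached, so weaving them together is cheap enough to do in real time.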

So why hasn’t everyone dived into VR yet?

It could be about something called usability. There’s a fact we all must face: we are human, humans are made of flesh, and flesh doesn’t digitise too well. We are also very flighty, and we can smell a fake a mile off; it’s even built into our brains to spot a lie, because it saves our lives.

But for now, instead of dematerialising our physical bodies (which one day we may well do), we do the obvious thing engineers do: we design a pair of glasses to pop on our little human faces, show an alternative reality through them, and try to trick our brains into thinking we are somewhere else. What could go wrong with that idea? It makes sense, at least to engineers and to billions in investment.

VR and AR HMD systems and why they may not work.

So here’s why I think Virtual Production may be another major gateway to the metaverse. With an HMD you always know you are ‘in VR’: when you put your headset on, take a VR tour, or play a VR game, you can feel the lie; your mind says one thing, your body says another.

It’s no different with AR. You know the graphics aren’t really there when you see them on the street; they don’t smell, and you cannot feel them. Still, AR does feel somewhat better for users than VR. Headsets are great for short stretches of time and for certain tasks, but HMDs are not for everyone. Here’s an idea we’ve been toying with.

People believe what they see, right?

In our LAB in North Wales, we’ve been looking at how to project people into the metaverse.

Instead of trying to trick the brain into thinking it’s inside the metaverse when it isn’t, we’ve been working with manufacturers to create virtual production systems that use the human brain’s natural instincts (its built-in programs) as the viewing device. We’ve been creating environments (mostly sets for productions) that feed the brain, through the eyes, with the right stimuli to really make humans feel like they are in two places at once.

We call it Metaprojection.

This ‘remote viewing’ approach of Metaprojection lets users see themselves working or playing inside the metaverse as an active observer, which, judging by the feedback from our users, seems almost as good as actually being there, with less disorientation and more joy.

In our latest VP pipelines we’ve used both HMDs and Metaprojection: users can move between observing themselves working in VR in first person, through the VR headsets, and in third person, seeing themselves replicated in the engine as a photoreal projection of themselves on 2D screens in and around the studio.
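Mechanically, that switch is simple to model. Here’s a rough, engine-agnostic Python sketch (the class and names are ours for illustration, not any particular engine’s API) of toggling a participant between the headset’s first-person view and the third-person Metaprojection view:

```python
from enum import Enum, auto

class ViewMode(Enum):
    FIRST_PERSON = auto()  # through the HMD, from the user's own eyes
    THIRD_PERSON = auto()  # Metaprojection: watching your avatar on 2D screens

class ParticipantView:
    """Tracks which viewpoint a participant is currently using.

    A real pipeline would also retarget the render output
    (headset vs. studio screens); this models only the switch.
    """
    def __init__(self) -> None:
        self.mode = ViewMode.FIRST_PERSON

    def toggle(self) -> ViewMode:
        self.mode = (ViewMode.THIRD_PERSON
                     if self.mode is ViewMode.FIRST_PERSON
                     else ViewMode.FIRST_PERSON)
        return self.mode

view = ParticipantView()
print(view.toggle())  # ViewMode.THIRD_PERSON: now watching yourself in the engine
```

The tracked body and the engine scene stay the same either way; only the viewpoint changes, which is part of what makes moving between the two views feel seamless.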

Metaprojection can be recorded live as layered or composited content (currently 4K 4:2:2 video, FBX, FIZ, ARRI RAW and so on), which is how our research has become a favourite on-set tool for VFX Directors and VFX houses, who use Metaprojection methods to previsualise actors in fantastic worlds.
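Under the hood, a take like that is essentially a bundle of synchronised layers. Here’s a hypothetical sketch of how such a layered recording might be catalogued in Python; the paths and layer names are invented, and only the formats come from the list above:

```python
from dataclasses import dataclass

@dataclass
class RecordingLayer:
    name: str  # human-readable layer name
    kind: str  # "video", "geometry", "lens" or "raw"
    fmt: str   # container or metadata format
    path: str  # where this layer of the take was written

take = [
    RecordingLayer("plate",      "video",    "4K 4:2:2 video", "takes/042/plate.mov"),
    RecordingLayer("scene",      "geometry", "FBX",            "takes/042/scene.fbx"),
    RecordingLayer("lens",       "lens",     "FIZ metadata",   "takes/042/lens.fiz"),
    RecordingLayer("camera_raw", "raw",      "ARRI RAW",       "takes/042/a_cam.ari"),
]

# A VFX house can pull just the layers it needs for previs:
for layer in take:
    print(f"{layer.name:<10} {layer.fmt:<15} -> {layer.path}")
```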

Once we are projecting someone into the digital space, we set up ways for the person being projected to observe themselves. We do this using 2D viewing technologies, such as large projectors in the studio or multiple high-definition screens, where users can watch themselves interacting in the metaverse. Now that we have the technology, the next step is more user research.

These sessions using Metaprojection to see yourself in the metaverse are, as yet, an elaborate and expensive exercise, and we would benefit from the miniaturisation and wider adoption of 4K direct-to-system video in place of high-end cameras recording to disk. Being yourself not only in a real sense but also in a cinematic sense may also add to the UX, by making participants feel good about themselves.

For now, we’ll still be putting on the HMDs for our virtual scouting sessions, but more and more we are starting to use Metaprojection as a way for people to collaborate, visualise and tell stories together in digital space.

 
