
Unity Acquires Weta Digital: What It Means for Virtual Production


Earlier this week Unity Software Inc., the real-time 3D development platform, announced a $1.625 billion acquisition of Weta Digital, Peter Jackson's New Zealand-based VFX and technology company. The deal promises to bring Weta Digital's industry-famous tools to Unity creators globally. The tools developed by Weta Digital have been used on the likes of “Avatar,” “Game of Thrones,” and “The Lord of the Rings.”

Unity plans to bring these previously unattainable tools to users through a cloud-based workflow. The exact pricing strategy has yet to be released, but it is rumored to be a software-as-a-service model for enterprise users. Weta Digital had previously toyed with the idea of commercializing the tools independently but decided Unity, with its evident scale and cloud-oriented strategies, was the best route for bringing them to market.

Peter Jackson said in a statement, “Weta Digital’s tools created unlimited possibilities for us to bring to life the worlds and creatures that originally lived in our imaginations. Together, Unity and Weta Digital can create a pathway for any artist, from any industry, to be able to leverage these incredibly creative and powerful tools. Offering aspiring creatives access to Weta Digital’s technology will be nothing short of game changing, and Unity is just the company to bring this vision to life.”

What are Weta Digital Tools?

Here’s a sneak peek at what exactly Weta Digital’s tools are:

  • Manuka: Manuka is the flagship path-tracing renderer used to generate final frames and is able to produce physically accurate results based upon specific spectral lighting profiles.
  • Gazebo: Gazebo is the core interactive renderer used for viewing scenes in real time with visual fidelity inside any pipeline attached application. Since the Gazebo real-time rendering of the 3D viewport approaches the same results from Manuka, artists can iterate in context of the final frame regardless of which application they use. Gazebo is also the core of the production pipeline for pre-visualization and virtual production workflows.
  • Loki: Loki provides physics-based simulation of visual effects including water, fire, smoke, hair, cloth, muscles, and plants. Physical accuracy for complex simulations is delivered through the use of cross-domain coupling and high-accuracy numerical solvers.
  • Physically-based workflows: Tools including PhysLight, PhysCam, and HDRConvert provide the foundation for lighting and color workflows. Using these tools, artists can create spectral-based lighting and accurately replicate effects of different lenses, sensors, and other parts of the pipeline, resulting in a physically accurate rendering workflow for both Gazebo and Manuka.
  • Koru: Koru is an advanced puppet rigging system optimized for speed and multi-character performance. Using Koru, technical directors and developers can create constraints, rigs, deformers, and puppets to support high-performance animation, cloth simulation, and similar applications.
  • Facial Tech: Facial Tech provides advanced facial capture and manipulation workflows, using machine learning to support direct manipulation of facial muscles and transferring actor face capture onto a target (puppet) model.
  • Barbershop: Barbershop is a suite of tools for hair and fur that supports the entire workflow from growth through grooming. Artists can use a combination of procedural and artist-guided tools to grow hair and fur, adjust growth patterns, and groom the final model. Advanced procedural tools support concepts such as braided hair, and the resulting models are simulation-ready to provide realistic dynamics resulting from motion and wind.
  • Tissue: Tissue enables artists and animators to create biologically accurate anatomical character models that accurately represent behaviors of muscle and skin, and transfer the resulting characters into simulation tools.
  • Apteryx: Apteryx provides artists with a complete workflow starting with procedural generation of feathers, hand sculpting, and grooming for animated feathered creatures and costumes.
  • World Building: These tools include Scenic Designer and Citybuilder to support world building, layout, and set dressing ranging from planet-scale to small-scale scenes. With these tools, artists can procedurally create scenes with node graphs, place content programmatically, and manually adjust placement.
  • Lumberjack: Lumberjack provides the core toolset for vegetation and includes modeling, editing, and deformation tools. Using Lumberjack, artists can author and edit plant topology including animated geometry, manage levels of detail, instancing, and variability among individual assets.
  • Totara: Totara is a procedural growth and simulation system for vegetation and biomes that integrates with Lumberjack to create large-scale and complex scenes procedurally. Using Totara, artists can grow individual trees and entire forest biomes, grow other vegetation such as vines, adjust growth parameters and control biomechanics, add snow cover, and reduce the complexity and size of scenes.
  • Eddy: Eddy is an advanced liquid, smoke and fire compositing plug-in for refining volumetric effects. Eddy allows artists to generate new, high-quality fluid simulations and render them directly inside their compositing environment.
  • Production Review: HiDef and ShotSub are the foundation for production review. HiDef is a core tool for production review, with features for note taking, version browsing, and more, integrated with a color-accurate browser and playback engine. ShotSub is a core tool for production review, with tools to prepare artist work for review with the appropriate color space, frame ranges, and settings for frame rate and resolution.
  • Live Viewing: Live viewing tools support the mixing of computer-generated (CG) content in real-time with on-set camera feeds. These tools support live mixing for on-set viewing, live compositing of CG elements onto chromakey or other CG elements, depth-based live compositing and projection of face capture onto a motion capture puppet.
  • Projector: Projector is a production tool supporting scheduling, resourcing, and prediction, with controls for data access and analytics to improve production decision-making.

What Does the Acquisition Mean for Virtual Production?

The acquisition shows Unity’s clear aim of growing its footprint in the VFX industry, an ambition and strategy that will almost certainly spill over into the current virtual production sectors. Whether Unity plans to go up against Epic Games in the current LED volume market is unknown, but we would expect to see a boost in other sectors of virtual production, e.g., real-time compositing and full virtual and AR productions.

Unity’s pricing and distribution strategy for the Weta Digital tools marks another major 3D player converting to a cloud-based offering, following the likes of Nvidia with its Omniverse development platform. Cloud-based software-as-a-service models seem to be an emerging trend, democratizing access to high-end hardware and enabling more users globally.

See Unity’s official announcement here.
