At Axis Studios, our experience spans twenty years in animation and VFX. We constantly strive to increase quality, creativity and innovation in all of our projects, from our BAFTA-winning episode ‘Helping Hand’ from Netflix’s Love, Death & Robots to our Clio Award-winning game trailer for Arkane Studios’ Deathloop.

In the early days of CG we wanted to do the impossible. That meant conjuring perfect worlds, with flawless lighting and clean, sweeping shots. Now the goal of CG is reality: imperfections make stories feel more real and, in turn, their universes more believable. With this in mind, our team has been exploring the popularity of virtual production and the benefits of realtime animation.

The Popularity Of Virtual Production

Over the past few years virtual production has picked up massive momentum, becoming more advanced and more accessible than ever. Back in 2009, James Cameron spent a huge amount of money to use virtual production on the set of Avatar. Today, engines like Unreal and Unity are going head to head in the battle for the virtual production crown, with the technology applied in productions such as Game of Thrones, The Lion King and Avatar 2.

Using A Virtual Camera

The inspiration for our virtual camera setup came from our desire to build the best story we can. It gives the director the same creative control and direction a live set would, and lets us explore those ‘happy accidents’ without sacrificing the storytelling or the production timeline. Using the virtual camera gives extra refinement and creative freedom to all of our artists.

Thanks to Epic, setting up the virtual camera wasn’t too difficult. We started out with an iPad, using AR tracking to get a first version of the virtual camera up and running. However, a mixture of lag issues and a lack of tracking points in our capture space meant we needed another solution, so we switched to a Vive tracker, using our room-scale HTC Vive setup. This gave us the demo we had wanted: tracking that was smooth and accurate.
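For anyone curious what that looks like outside the engine, the sketch below polls a Vive tracker’s pose through the pyopenvr bindings and applies simple smoothing. It’s a minimal illustration under stated assumptions: exact call signatures vary between pyopenvr versions, the smoothing factor is arbitrary, and in a real pipeline the pose would be streamed into Unreal (for example via Live Link) rather than printed.

```python
# Minimal sketch: polling a Vive tracker pose via the pyopenvr bindings.
# The smoothing factor and console output are illustrative assumptions; a
# production setup streams the pose into the engine instead of printing it.
import time
import openvr

vr_system = openvr.init(openvr.VRApplication_Other)

def find_tracker():
    """Return the device index of the first generic tracker, or None."""
    for i in range(openvr.k_unMaxTrackedDeviceCount):
        if vr_system.getTrackedDeviceClass(i) == openvr.TrackedDeviceClass_GenericTracker:
            return i
    return None

tracker = find_tracker()
smoothed = None
ALPHA = 0.3  # exponential smoothing factor (assumption: tune to taste)

try:
    while tracker is not None:
        poses = vr_system.getDeviceToAbsoluteTrackingPose(
            openvr.TrackingUniverseStanding, 0, openvr.k_unMaxTrackedDeviceCount)
        pose = poses[tracker]
        if pose.bPoseIsValid:
            m = pose.mDeviceToAbsoluteTracking  # 3x4 row-major transform
            position = (m[0][3], m[1][3], m[2][3])
            # Simple exponential smoothing to damp residual jitter.
            if smoothed is None:
                smoothed = position
            else:
                smoothed = tuple(ALPHA * p + (1 - ALPHA) * s
                                 for p, s in zip(position, smoothed))
            print("tracker position (m): %.3f %.3f %.3f" % smoothed)
        time.sleep(1 / 60)  # poll at roughly camera frame rate
finally:
    openvr.shutdown()
```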

Deathloop And Gears 5

We had already brought Axis' 20 years of storytelling into realtime with Amazon's Lumberyard trailer, with Axis' first IP the bOnd (nominated for a Cannes Lions and for VR Film of the Year at the VR Awards), and with the seamless in-game cinematics of Gears 5. Now we wanted to expand our storytelling, leveraging our realtime skills and tech to help us with our pre-rendered content.

Deathloop was our first in-production test. We used the camera across multiple shots, bringing the sets and characters in from layout for filming in Unreal. Matching each sequence edit in engine let us capture each shot and immediately review the cameras in a full sequence edit. The camera style and motion lent themselves very well to the virtual camera, leading us to use it on the Gears 5 Escape Trailer.
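Much of that sequence wrangling can be scripted through Unreal’s editor Python API. The sketch below shows the general shape of binding a camera into a Level Sequence for review; the asset path and frame range are hypothetical placeholders, the exact calls vary between engine versions, and this is an illustration rather than our actual pipeline code.

```python
# Rough sketch (UE 4.2x-era editor Python API): spawn a cine camera and bind
# it into an existing Level Sequence so a recorded take can be reviewed in
# context. The asset path and frame range are hypothetical placeholders.
import unreal

# Load the shot's Level Sequence (hypothetical path).
sequence = unreal.load_asset('/Game/Cinematics/SH0100/SH0100_seq')

# Spawn a cine camera in the level to receive the recorded take.
camera = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.CineCameraActor, unreal.Vector(0.0, 0.0, 0.0))

# Bind the camera into the sequence and give it a transform track, into
# which the virtual camera take's keys can then be baked.
binding = sequence.add_possessable(camera)
transform_track = binding.add_track(unreal.MovieScene3DTransformTrack)
section = transform_track.add_section()
section.set_range(0, 240)  # hypothetical 10-second shot at 24 fps

# Save the updated sequence asset.
unreal.EditorAssetLibrary.save_loaded_asset(sequence)
```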

For the Gears 5 Escape Trailer, we shot 150 takes in the space of a day, working with a larger team: the CG Supervisor, Producer, Cameraman and Director all watching live. This felt much closer to an on-set experience, similar to a mocap shoot, calling out preferred and alt takes and recording videos for selection.
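That calling-out only pays off if every take is logged as it happens. Something as simple as the sketch below covers it; the CSV layout and rating labels are illustrative assumptions, not a description of Axis’ actual production tooling.

```python
# Minimal sketch of a take log kept during a virtual camera shoot.
# File name, columns and rating values are illustrative assumptions.
import csv
import datetime

LOG_PATH = 'gears5_escape_takes.csv'  # hypothetical filename
FIELDS = ['take', 'shot', 'timestamp', 'rating', 'notes']

def log_take(take_number, shot, rating, notes=''):
    """Append one take to the shoot log; rating is 'preferred', 'alt' or 'ng'."""
    with open(LOG_PATH, 'a', newline='') as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if f.tell() == 0:  # write a header the first time the file is used
            writer.writeheader()
        writer.writerow({
            'take': take_number,
            'shot': shot,
            'timestamp': datetime.datetime.now().isoformat(timespec='seconds'),
            'rating': rating,
            'notes': notes,
        })

# Called live as the director reacts to each take:
log_take(37, 'SH0100', 'preferred', 'nice drift at the end')
log_take(38, 'SH0100', 'alt', 'faster push-in')
```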

Shots that would be painstaking to animate realistically could now be captured as fast as they play out in realtime, allowing five takes in the space of five minutes or less. It also meant we could try alternate shots quickly and easily. We could then build the edit by dropping in the recorded cameras and review the shot, the sequence, or even the whole show in context, live in Unreal in realtime. The ease of additional iterations means better choices and better quality can be achieved without massive rendering hurdles. Gears 5 proved our camera system worked well under production pressures and could be used for final cameras in our pre-rendered pipeline.

The Benefits Of Virtual Production 

Virtual production offers a solution to guesswork, allowing production teams to visualise live footage with CG backplates, characters and performances. This year Epic released a demo of a live set recording using nDisplay and on-set LED panels, which allowed the Director of Photography to capture final footage of the actor in camera, along with a live-tracked CG background and lighting. Not only did this replace the need for a standard green screen, it also meant no post-compositing of CG renders, with 90% of the work done in camera. The CG could then interact with the real world, reducing the time needed for CG integration and grading.

We’re excited to be at the forefront of how the industry is changing and evolving alongside advances in virtual production and other technologies. We’re proud to have several upcoming projects at Axis that utilize virtual production - watch this space! To keep up to date with our news and events, read more here.