Real-Time Motion Capture

The brief

With Virtual Production merging the real world with the virtual, productions have mostly used these new tools to add backgrounds, placing real-world actors into a virtual environment. What has not been thoroughly explored is having real people interact with virtual characters. Final Pixel devised a research and development shoot to take a live motion-capture character and have it interact and work with a real-world actor. In the process, we worked to overcome all of the technical barriers involved in taking a human performer and placing them in a different skin in the virtual world. We learned numerous best practices for working with the inherent lag between a mocap actor's movement and the on-screen character's movement.

Location
London

Services
Virtual Production
Production
Virtual Art Department

Final Pixel successfully achieved live body and facial motion capture streamed into Unreal and played through Disguise, using cluster rendering to drive a high-quality bespoke 3D character, built with a traditional CG pipeline to an extremely high level of detail. We were able to create real-time interactions between the characters in-camera, with no noticeable latency for the viewer.
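
To make that pipeline concrete for readers less familiar with the plumbing: the ingestion side of a workflow like this boils down to receiving a stream of timestamped skeletal frames over the network and applying them to a rigged character before each render tick. The sketch below is purely illustrative; the packet layout, port and joint encoding are assumptions made for the example, not the protocol used on this shoot, which ran through Unreal's native tooling.

```python
import socket
import struct

# Hypothetical wire format (an assumption for illustration, not the
# actual protocol used on the shoot), one UDP datagram per mocap frame:
#   double  capture_timestamp   seconds on the sender's clock
#   uint32  joint_count
#   then joint_count * (4 x float32) rotation quaternions (x, y, z, w)
HEADER = struct.Struct("<dI")
QUAT = struct.Struct("<4f")

def receive_frames(host="0.0.0.0", port=54321):
    """Yield (capture_timestamp, joint_rotations) for each incoming frame."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((host, port))
    while True:
        data, _ = sock.recvfrom(65535)
        ts, joint_count = HEADER.unpack_from(data, 0)
        joints = [
            QUAT.unpack_from(data, HEADER.size + i * QUAT.size)
            for i in range(joint_count)
        ]
        yield ts, joints

if __name__ == "__main__":
    for ts, joints in receive_frames():
        # In the engine, this is where the pose would be retargeted onto
        # the character's skeleton before the next render tick.
        print(f"frame captured at {ts:.3f}s with {len(joints)} joints")
```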

The live body and facial mocap performance and composite were extremely smooth, with minimal latency, so in terms of what we set out to achieve this test was a huge success. It was designed to throw challenges at the systems and break things, which it did; we now know what to do next time to improve, which is the very essence of doing R&D.
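
It is worth putting a number on "minimal latency" rather than judging it by eye. A minimal sketch of how that could be done, assuming each frame carries its capture timestamp as in the example above and that sender and receiver clocks are synchronised (for instance via NTP or PTP): accumulate per-frame capture-to-display deltas and report percentiles, because the spikes an actor notices are hidden by the mean.

```python
import statistics
import time

class LatencyMonitor:
    """Track capture-to-display latency for streamed mocap frames.

    Absolute numbers only mean something if sender and receiver clocks
    are synchronised; with unsynchronised clocks the spread between
    percentiles (jitter) is still informative.
    """

    def __init__(self, window=600):
        self.window = window   # keep roughly the last 600 frames
        self.samples = []

    def record(self, capture_timestamp):
        # Call at the moment the frame is actually presented on screen.
        self.samples.append(time.time() - capture_timestamp)
        if len(self.samples) > self.window:
            self.samples.pop(0)

    def report(self):
        if not self.samples:
            return {}
        ordered = sorted(self.samples)
        return {
            "mean_ms": statistics.mean(self.samples) * 1000,
            "p50_ms": ordered[len(ordered) // 2] * 1000,
            "p99_ms": ordered[int(len(ordered) * 0.99)] * 1000,
        }
```

As a rough yardstick, one frame at 25 fps is 40 ms, so keeping the worst-case delay within a couple of frames is the practical target for interactions that have to read as natural in-camera.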

Potential uses of this approach are many and significant:

- The most obvious is in the virtual production pipeline for film, TV and advertising, as practised at Final Pixel, allowing for live interactions between digital and human characters, all filmed in real time and in-camera.

- Live-action mocap with creatures and characters that can then be replaced by full-scale CG in post, thereby capturing more ‘natural’ actor reactions and engagement versus the use of a green screen.

- Creating higher-fidelity augmented reality plates using the enhanced functionality of Disguise as a stage management tool, in particular for live broadcasts. To put this innovation in industry context, recent advances in NVIDIA graphics processing, with A6000-driven render nodes, played a huge part in making this possible now; it would certainly not have been possible to the same extent a year ago.

- Most promising of all, the issues raised in the key findings from this shoot are known issues that relate more generally to all virtual productions using this workflow.

What we did