
Pushing the limits of technology for filmmakers: letting post-production start live during shooting!

Discover how Cooke partnered with EZtrack to revolutionize the art of filmmaking in this exclusive behind-the-scenes look at the forthcoming film Comandante, narrated by VFX Designer Kevin Tod Haug. This new workflow allows filmmakers to marry images from on-set cameras with accurately distorted 3D renders in Near Real Time.


Introducing the Near Real Time workflow (NRT)

The NRT (Near Real Time) pipeline is a concept engineered by David Stump and Kevin Haug for the production of the film Comandante.

The workflow brings together key advances in lens calibration, machine learning and real-time rendering to deliver higher-quality composites of what was just shot to the filmmakers, in a matter of minutes.

The NRT pipeline blends several technologies together: EZtrack is one of them, combining our tracking system with our expertise in lens metadata.

 

The film Comandante: NRT set-up overview

During filming, all camera and lens metadata were recorded and streamed into Unreal.

The data was then fed into Nuke, where the Unreal Reader plugin enabled the filmmakers to re-render the background at high quality.

The virtual and physical images were merged with Nuke’s CopyCat tool, which uses machine learning. Re-rendering with the true recorded values of the tracked camera’s position removes the lag between tracking and the real-time render!
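To make the re-render step concrete, the recorded per-frame data can be pictured as a record keyed by timecode, looked up later for an exact, lag-free re-render. This is a minimal sketch; the field names and types are hypothetical, not the production pipeline's actual schema.

```python
from dataclasses import dataclass

@dataclass
class FrameRecord:
    """One frame of recorded on-set metadata (illustrative fields only)."""
    timecode: int                              # frame number
    cam_position: tuple[float, float, float]   # x, y, z in meters
    cam_rotation: tuple[float, float, float]   # pan, tilt, roll in degrees
    focus_mm: float
    zoom_mm: float
    iris_tstop: float

def record_for_frame(records: list[FrameRecord], timecode: int) -> FrameRecord:
    """Look up the exact recorded camera state for a frame, so the
    background can be re-rendered from true values rather than from the
    latency-affected live stream."""
    by_tc = {r.timecode: r for r in records}
    return by_tc[timecode]
```

Because every frame is keyed by its recorded timecode, the offline re-render always uses the camera state that was actually captured at that instant.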

 

Cooke Optics anamorphic lens distortion

On the Comandante set, our team achieved precise live virtual reproduction of anamorphic distortion and shading.

EZtrack was used as the primary on-set data collection device, ensuring that all metadata was properly interpreted for the workflow.

In particular, our technical team took the Cooke coefficients and applied them to the image automatically, enabling highly accurate distortion maps.
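The idea of turning per-lens coefficients into a distortion map can be sketched with a generic polynomial radial model plus an anamorphic squeeze factor. Cooke's actual /i distortion model and coefficients are the lens maker's own and are not described in this article; the function below is only an illustration of the general technique.

```python
def distort(x: float, y: float, k1: float, k2: float,
            squeeze: float = 2.0) -> tuple[float, float]:
    """Map an undistorted normalized image point to its distorted position.

    Generic radial model: illustrative only, NOT the Cooke /i model.
    k1, k2 stand in for per-lens distortion coefficients; 'squeeze' is
    the anamorphic squeeze ratio (2.0 for typical anamorphic lenses).
    """
    xs = x / squeeze                        # undo the anamorphic squeeze
    r2 = xs * xs + y * y                    # squared radius from optical center
    scale = 1.0 + k1 * r2 + k2 * r2 * r2    # polynomial radial distortion
    return xs * scale * squeeze, y * scale  # re-apply the squeeze
```

Evaluating this function over a grid of image points yields a distortion map that can be applied to the CG render so it lines up with the photographed plate.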

 

/i protocol support for real-time visualization

For Comandante, instead of shooting lens grids, the team relied on the accurate factory lens mapping of Cooke Optics’ /i Technology system, which embeds distortion and shading parameters in each lens itself.

EZtrack implements the /i protocol in order to communicate with Cooke lenses. This communication takes place over a serial interface through the external port of the Cooke lens.

At each frame, on the genlock pulse, EZtrack requests and retrieves all the basic lens information and distortion model parameters: EZtrack stores all lens metadata in our proprietary TCD protocol and sends it over the network.
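The per-frame packaging step might look like the sketch below: lens values polled on the genlock pulse are serialized into a binary datagram ready to be sent over the network. The field layout, names and byte order here are entirely hypothetical; the real TCD protocol is proprietary and not documented in this article.

```python
import struct

def pack_lens_packet(timecode_frames: int, focus_mm: float, iris_tstop: float,
                     zoom_mm: float, coeffs: list[float]) -> bytes:
    """Serialize one frame of lens metadata into a network datagram.

    Illustrative layout only (NOT the actual TCD format):
      uint32  timecode (frame count)
      float32 focus distance (mm), iris (T-stop), focal length (mm)
      uint8   number of distortion coefficients, then float32 each
    All fields big-endian ('!' in struct notation).
    """
    header = struct.pack("!I", timecode_frames)
    body = struct.pack("!3f", focus_mm, iris_tstop, zoom_mm)
    dist = struct.pack(f"!B{len(coeffs)}f", len(coeffs), *coeffs)
    return header + body + dist
```

One such datagram per genlock pulse keeps lens metadata frame-accurate for every downstream consumer on the network.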

 

From real-time compositing straight to post

Set-up for positional camera tracking

For the Comandante production shoot, our EZtrack Hub unit was primarily used as a motion-tracking aggregator.

Multiple motion sensors and protocols were used on this shoot:

Mechanical tracking

  • Dolly track

  • Dolly elevation

  • Pan/tilt Miller head

FreeD protocol for a Technocrane

IR Lighthouse sensors

  • Steadicam shots on the water for the sinking-rowboat sequences

Laser sensor

  • Real-time distance measurement

  • Fusion into the EZtrack Hub

  • Remote control

The primary render engine, Unreal Engine, was running in a truck a hundred meters away from the shooting stage.

A wireless iPad was also in use on set, close to the camera, to check live tracking data parameters.
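The aggregation role described above can be illustrated as merging timestamped samples from several sources into one camera state per frame. This is a toy sketch, assuming invented source names and channels; the real EZtrack Hub also handles offsets, filtering and coordinate conversion.

```python
from dataclasses import dataclass

@dataclass
class Sample:
    source: str    # e.g. "dolly_encoder", "freed_crane", "laser" (hypothetical names)
    timecode: int  # frame count
    values: dict   # sensor-specific channels

def fuse(samples: list[Sample], timecode: int) -> dict:
    """Return, per source, the most recent sample at or before a timecode.

    Sorting by timecode and overwriting means the latest valid sample
    from each source wins -- a minimal stand-in for sensor aggregation.
    """
    merged: dict = {}
    for s in sorted(samples, key=lambda s: s.timecode):
        if s.timecode <= timecode:
            merged[s.source] = s.values
    return merged
```

Keying everything by timecode is what lets heterogeneous sensors (encoders, FreeD, lasers) contribute to a single coherent camera pose per frame.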

 

How did EZtrack benefit the Comandante set-vis?

Our EZtrack Hub aggregates data on set from various sources.

On the Comandante project, our team collected and merged positional tracking data from different kinds of sensors and protocols, such as a mechanical crane, infrared lasers and rotary encoders.

EZtrack always knows the exact movements of the camera at a precise timecode, and the system now goes a step further by aggregating Cooke lens metadata!

Giving directors the confidence that a shot works, in moments!

The EZtrack & Cooke R&D partnership unlocks, for the first time, the use of accurate optical information for live and post-production set extension!