Technical Q&A

About EZtrack compatibility with cameras/lenses?

The EZtrack Hub unit works with any kind of camera and optical lens.

  • For lenses with digital zoom/focus output (broadcast lenses), EZtrack will read zoom/focus data directly from the lens through a custom lens cable. The Hub can connect directly to the Canon, Fujinon, Cooke and Angénieux lines of encoded lenses.
  • For non-encoded or prime lenses, we will provide you with external wheeled encoders for zoom/focus readout; these are available as a priced add-on option (see the « Encodacam external encoders » section).

Important notes:

  1. It is necessary to work with cameras that have a genlock input. Very accurate tracking depends on precise timing, so the sync/genlock function is essential; without it, the final AR result will be neither fluid nor of sufficient quality!
  2. We recommend opting for a wide-angle lens to get the best AR result from EZtrack.

Is there compatibility with PTZ cameras?

A PTZ camera is a fixed camera with pan, tilt and zoom movements.
The Panasonic AW-UE100/150 and Sony PTZ cameras can be controlled remotely and are natively tracked, so they can send pan/tilt/zoom data to the 3D render engine over FreeD.
PTZ cameras can be coupled with EZtrack for extended use: combined with both EZtrack and a positional tracker, a PTZ can be placed on a crane or a dolly, becoming a fully movable camera!

EZtrack will read the PTZ pan & tilt data and merge it with its own tracking data to send a unified and complete tracking feed to the 3D render engine.
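To make the idea concrete, here is a minimal Python sketch of how pan/tilt angles from a PTZ head could be merged with position data from an external tracker into one unified pose. This is an illustration only, not EZtrack's actual implementation; the CameraPose structure and field names are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class CameraPose:
        """Unified pose sent to the render engine (hypothetical structure)."""
        pan: float    # degrees, from the PTZ head
        tilt: float   # degrees, from the PTZ head
        roll: float   # degrees, from the positional tracker
        x: float      # millimetres, from the positional tracker
        y: float
        z: float
        zoom: float   # raw lens encoder value
        focus: float

    def merge_ptz_and_tracker(ptz: dict, tracker: dict) -> CameraPose:
        """Combine rotation from the PTZ head with translation from the
        tracker mounted on the dolly/crane carrying the PTZ camera."""
        return CameraPose(
            pan=ptz["pan"], tilt=ptz["tilt"], roll=tracker["roll"],
            x=tracker["x"], y=tracker["y"], z=tracker["z"],
            zoom=ptz["zoom"], focus=ptz["focus"],
        )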

Does EZtrack work with XR studios?

EZtrack works with green-screen studios as well as with XR studios. And if you already have a motion capture system such as Vicon, OptiTrack or Qualisys installed in your XR studio, you can combine it with the EZtrack Hub, which will merge the live tracking data and the lens data into one FreeD stream sent to the render engine.

Does EZtrack work in any studio conditions?

With Lighthouse technology, we achieve 1 mm precision in translation and 0.01° precision in rotation. It is about the same with Antilatency.

Please note that we offer a smart filtering option to smooth the data, and since version 1.6 of the EZtrack OS we have also added a steady-detection mode that removes any jitter while the camera is static.
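As an illustration of the principle (not the actual EZtrack OS algorithm), a steady-detection filter can hold its output while motion stays below a small threshold and smooth it otherwise. The Python sketch below uses made-up threshold and smoothing values.

    class SteadyFilter:
        """Illustrative jitter filter: holds the output while the camera is
        effectively static, smooths it while moving (hypothetical values)."""

        def __init__(self, threshold_mm=0.5, alpha=0.3):
            self.threshold = threshold_mm  # motion below this is treated as jitter
            self.alpha = alpha             # smoothing factor while moving
            self.output = None

        def update(self, position_mm):
            """position_mm is an (x, y, z) tuple in millimetres."""
            if self.output is None:
                self.output = position_mm
                return self.output
            delta = max(abs(n - o) for n, o in zip(position_mm, self.output))
            if delta < self.threshold:
                return self.output  # steady mode: hold the last pose
            # Moving: exponential smoothing toward the new sample.
            self.output = tuple(self.alpha * n + (1 - self.alpha) * o
                                for n, o in zip(position_mm, self.output))
            return self.output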

What are the benefits of the EZtrack Hub x Antilatency combo for camera tracking?

EZtrack® can be combined with Antilatency, giving virtual production enthusiasts a unique set of technical features at their fingertips. Our EZtrack “HUB STARTER KIT” is especially intended to be interfaced with any custom Antilatency setup while providing our premium features through the Hub: lens data processing, genlock, a unified interface and camera tracking over IP.


This combination is relevant for large setups where the tracking area exceeds 10×10 m, and for setups with object tracking, the Alt tracker being the smallest tracker on the market.

How big is the tracking area that EZtrack® can cover?

By default, with the regular EZtrack x Lighthouse tracking bundle, you can track inside a volume of 10×10 meters. In this configuration, we advise staying within a range of 8×8 meters to get the maximum accuracy and bandwidth from the setup.

If you need to extend your tracking area beyond 10×10 m, you can opt for the EZtrack x Antilatency combo: as long as you put enough markers on the ceiling or on your studio floor, you can extend your tracking area without any size limit at all!

About sensor/tracker support & configuration for live camera/object/talent tracking?

Regarding tracker support, whether Vive or Antilatency Alt, the typical configuration is the following (see the sketch after this list):

  • One EZtrack Hub unit per camera tracked on set (as one unit can only read one optical lens at a time)
  • One tracker (Vive or Antilatency Alt) for the camera tracking
  • One tracker (Vive only) for system calibration/recalibration; not necessary in the case of an Antilatency setup.
  • Up to two additional trackers for virtual object and/or talent tracking in a Vive config; up to three in an Antilatency Alt-based config.
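For illustration, a typical two-camera, Vive-based configuration following the list above could be written down like this (a hypothetical Python sketch; the field names are illustrative, not an actual EZtrack configuration format):

    # Hypothetical description of a typical two-camera, Vive-based setup.
    setup = {
        "hubs": [  # one Hub per tracked camera, one lens each
            {"camera": "CAM-A", "lens": "Fujinon (encoded)", "tracker": "vive-1"},
            {"camera": "CAM-B", "lens": "prime + external encoders", "tracker": "vive-2"},
        ],
        "calibration_tracker": "vive-3",         # Vive only; not needed with Antilatency
        "extra_trackers": ["vive-4", "vive-5"],  # up to two for objects/talents on Vive
    }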

About FreeD and TCD data protocol support?

  • FreeD is a protocol that sends camera positioning data from the camera to a virtual reality production system. It is currently the most widely supported protocol on the market, and the main rendering engines can read it. By default, EZtrack sends FreeD data (see the packet sketch after this list).

  • TCD is EZtrack's in-house data protocol!
    Compared with the FreeD protocol, it sends not only camera position data but also lens intrinsic data (zoom/focus, field of view, distortion parameters, nodal offset) in the same data packet.
    TCD is a new protocol. It is already integrated into the Zero Density, Pixotope and Disguise solutions, and is being integrated into more and more leading render engines over time.
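To show what a FreeD feed looks like on the wire, here is a minimal Python encoder for a FreeD type-D1 packet, assuming the commonly published layout (angles as signed 24-bit values in 1/32768 degree, positions in 1/64 mm, zoom/focus as raw 24-bit counts). This is a sketch for illustration, not EZtrack code.

    import struct

    def _i24(value):
        """Pack an integer as signed 24-bit big-endian (two's complement)."""
        return struct.pack(">i", value)[1:]  # drop the high byte

    def freed_d1_packet(camera_id, pan_deg, tilt_deg, roll_deg,
                        x_mm, y_mm, z_mm, zoom, focus):
        """Build a 29-byte FreeD type-D1 packet."""
        body = bytes([0xD1, camera_id & 0xFF])   # message type, camera ID
        body += _i24(round(pan_deg * 32768))     # angles: 1/32768 degree
        body += _i24(round(tilt_deg * 32768))
        body += _i24(round(roll_deg * 32768))
        body += _i24(round(x_mm * 64))           # positions: 1/64 mm
        body += _i24(round(y_mm * 64))
        body += _i24(round(z_mm * 64))
        body += _i24(zoom) + _i24(focus)         # raw 24-bit encoder counts
        body += b"\x00\x00"                      # spare/user bytes
        checksum = (0x40 - sum(body)) % 256      # FreeD checksum rule
        return body + bytes([checksum])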

Does the camera always need to be visible to all sensors?

Basically, for the typical Lighthouse/Vive-based tracking, it is recommended to avoid passing in front of or close to the camera tracker (to avoid any cut in the signal), but the technology is robust enough to keep tracking precisely even if one or more base stations are temporarily covered by a prop, for instance.

Can you record the tracking data and align it with recorded clips for post-production? How to proceed?

Indeed, the timecode port featured on the EZtrack Hub allows you to record tracking data in the .FBX file format and use it in post-production pipelines (Nuke, Blender, Maya, etc.).
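In practice, aligning the recorded tracking data with a clip comes down to converting both start timecodes to frame counts and offsetting the camera animation accordingly. A minimal Python sketch (non-drop-frame timecode only; the values are a hypothetical example):

    def timecode_to_frames(tc, fps):
        """Convert a non-drop-frame SMPTE timecode 'HH:MM:SS:FF' to a frame count."""
        hh, mm, ss, ff = (int(part) for part in tc.split(":"))
        return ((hh * 60 + mm) * 60 + ss) * fps + ff

    # Tracking recording started at 10:00:00:00; the clip starts at 10:00:05:12 (25 fps).
    offset = timecode_to_frames("10:00:05:12", 25) - timecode_to_frames("10:00:00:00", 25)
    print(offset)  # 137 frames: shift the .FBX camera animation by this amount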