Positional Tracking Use Cases

Robotics #

Intelligent cars, drones, and robots need to perceive and understand their environment. They must operate in previously unknown and unstructured environments, where reliable external localization methods such as GPS are not always available. This creates the need for both local mapping, for fast obstacle avoidance, and global mapping, for path planning and navigation.

Motion tracking is crucial to building these local and global maps. It is used as an input to the Spatial Mapping module of the ZED, which creates a 3D mesh of the environment in real time. This mesh can be used directly for obstacle avoidance or converted into another map representation, such as an occupancy map or a navigation mesh, for path planning.
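
The sketch below illustrates this workflow with the ZED Python API (pyzed): positional tracking is enabled first, spatial mapping runs on top of it, and the fused mesh is extracted at the end. It is a minimal sketch assuming ZED SDK 3.x-style names; check the exact class and parameter names against your installed SDK version.

```python
# Minimal sketch: positional tracking feeding spatial mapping (pyzed, SDK 3.x-style API).
import pyzed.sl as sl

zed = sl.Camera()
init_params = sl.InitParameters()
init_params.coordinate_units = sl.UNIT.METER
if zed.open(init_params) != sl.ERROR_CODE.SUCCESS:
    exit(1)

# Positional tracking must be enabled before spatial mapping can run.
zed.enable_positional_tracking(sl.PositionalTrackingParameters())
zed.enable_spatial_mapping(sl.SpatialMappingParameters())

runtime = sl.RuntimeParameters()
pose = sl.Pose()
mesh = sl.Mesh()

for _ in range(500):  # map the scene for a few hundred frames
    if zed.grab(runtime) == sl.ERROR_CODE.SUCCESS:
        # Camera pose in the world frame, produced by the tracking module.
        zed.get_position(pose, sl.REFERENCE_FRAME.WORLD)

# Retrieve the fused 3D mesh built from the tracked poses and depth data.
zed.extract_whole_spatial_map(mesh)
mesh.save("environment_mesh.obj")

zed.disable_spatial_mapping()
zed.disable_positional_tracking()
zed.close()
```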

To learn more about live 3D mapping, read our section on Spatial Mapping.

Virtual and Augmented Reality #

Camera tracking is a key component of VR and AR applications. Current mobile headsets offer only rotational tracking, while desktop HMDs use external sensors to track the position of the user in space.

With inside-out positional tracking, it is now possible to offer fully immersive experiences by allowing users to walk through unlimited 3D space. A virtual camera reproduces the movement of the real camera, rendering animated elements in a virtual world or augmenting the live-action footage captured by the real camera.
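
The loop below sketches how the tracked pose can drive such a virtual camera: each frame, the translation and orientation of the real camera are read from the tracking module and handed to the rendering engine. The pyzed calls assume SDK 3.x-style names, and the final engine call (`virtual_camera.set_transform`) is hypothetical; use whatever transform API your engine exposes.

```python
# Minimal sketch: reading the camera pose each frame to drive a virtual camera.
import pyzed.sl as sl

zed = sl.Camera()
if zed.open(sl.InitParameters()) != sl.ERROR_CODE.SUCCESS:
    exit(1)
zed.enable_positional_tracking(sl.PositionalTrackingParameters())

runtime = sl.RuntimeParameters()
pose = sl.Pose()

while zed.grab(runtime) == sl.ERROR_CODE.SUCCESS:
    state = zed.get_position(pose, sl.REFERENCE_FRAME.WORLD)
    if state == sl.POSITIONAL_TRACKING_STATE.OK:
        # Translation in meters and rotation as a quaternion (x, y, z, w).
        t = pose.get_translation(sl.Translation()).get()
        q = pose.get_orientation(sl.Orientation()).get()
        # Apply t and q to the virtual camera of your engine here, e.g.:
        # virtual_camera.set_transform(position=t, rotation=q)  # hypothetical engine call
```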

For an example of how to use tracking in Unity, see the sample Build Your First MR App.

Visual Effects #

Match moving is primarily used to track the movement of a camera through a shot so that an identical virtual camera move can be reproduced in a 3D animation program. When new animated elements are composited back into the original live-action shot, they appear in correctly matched perspective and therefore look seamless.

Combining the ZED with professional cinema cameras simplifies 3D match moving in post-production. The tracking information can be transferred to computer graphics software and used to animate virtual cameras and simulated objects. Real-time camera tracking is also possible on set, allowing elements that will be inserted in post-production to be previewed live.
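
As a rough illustration of transferring tracking data to a graphics package, the sketch below records a per-frame camera pose (timestamp, translation, quaternion) to a CSV file that can later be mapped onto a virtual camera. The CSV layout is purely illustrative; production pipelines typically exchange camera tracks in formats such as FBX or Alembic, and the pyzed names assume an SDK 3.x-style API.

```python
# Minimal sketch: recording per-frame camera poses for later import into a 3D package.
import csv
import pyzed.sl as sl

zed = sl.Camera()
init_params = sl.InitParameters()
init_params.coordinate_units = sl.UNIT.METER
if zed.open(init_params) != sl.ERROR_CODE.SUCCESS:
    exit(1)
zed.enable_positional_tracking(sl.PositionalTrackingParameters())

runtime = sl.RuntimeParameters()
pose = sl.Pose()

with open("camera_track.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp_ns", "tx", "ty", "tz", "qx", "qy", "qz", "qw"])
    for _ in range(1000):  # record the shot frame by frame
        if zed.grab(runtime) != sl.ERROR_CODE.SUCCESS:
            break
        if zed.get_position(pose, sl.REFERENCE_FRAME.WORLD) == sl.POSITIONAL_TRACKING_STATE.OK:
            t = pose.get_translation(sl.Translation()).get()
            q = pose.get_orientation(sl.Orientation()).get()
            ts = pose.timestamp.get_nanoseconds()
            writer.writerow([ts, *t, *q])

zed.close()
```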

For an example of simple compositing using the ZED, see the sample on Green Screen Mixed Reality.