Body Tracking

In this tutorial, you will learn how to animate 3D avatars from the movements of real people using the ZED SDK AI module.

What is Body Tracking?

The Body Tracking module is very similar to the “Classic” Object Detection module, but uses another highly optimized AI model to detect people’s 3D skeletons, expressed as keypoints. It can detect up to 34 keypoints on a single person. For more details about the Body Tracking feature, please take a look at the Body Tracking documentation page.

In Unity, we will use this 3D keypoint information (more precisely, the orientation of each keypoint) to animate a 3D humanoid avatar based on a person’s movements in real time.
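
To make the idea concrete, here is a minimal, hypothetical sketch of driving one bone of a Humanoid rig from a keypoint orientation, using only Unity’s standard Animator API. This is not the plugin’s actual code: the ZED Unity examples handle the full skeleton mapping for you, and the GetKeypointLocalRotation helper below is just a placeholder for the data the SDK provides.

```csharp
using UnityEngine;

// Illustration only: apply a local rotation per keypoint to the matching bone
// of a Humanoid rig. How the rotations are obtained from the ZED SDK is hidden
// behind a hypothetical placeholder method.
public class KeypointToBoneExample : MonoBehaviour
{
    public Animator avatarAnimator;   // Animator of a model imported as Humanoid

    void LateUpdate()
    {
        // Example for a single bone; the real mapping covers the whole skeleton.
        Transform upperArm = avatarAnimator.GetBoneTransform(HumanBodyBones.LeftUpperArm);
        if (upperArm != null)
        {
            upperArm.localRotation = GetKeypointLocalRotation(HumanBodyBones.LeftUpperArm);
        }
    }

    // Placeholder: in practice this rotation comes from the ZED body tracking data.
    Quaternion GetKeypointLocalRotation(HumanBodyBones bone)
    {
        return Quaternion.identity;
    }
}
```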

Preparing your ZED 2

Optimize the AI models

The Object Detection/Body Tracking modules use AI models that need to be optimized before their first use. The Unity plugin will launch the optimization if necessary, but this will freeze the Editor for the duration of the process. We therefore advise optimizing the models beforehand using the ZED Diagnostic tool located in the ZED SDK installation folder (usually Program Files (x86)/ZED SDK/tools).

The process is described here: How can I optimize the ZED SDK AI models manually?

Ensure Floor Plane detection

The Object Detection module can use the position of the floor plane to improve its detections. For this to work, the floor plane should be visible in the image when the module starts, as in this image:

Setting Up the Scene

Basic Settings

  • Create a new scene and delete the Main Camera
  • In the Project window, go to ZED -> Prefabs and drag ZED_Rig_Mono into the Hierarchy
  • Select the new ZED_Rig_Mono in the Hierarchy.

  • In the Motion Tracking section, make sure Estimate Initial Position is checked. This enables floor detection.
  • In the Inspector, set the Resolution to 1080p. This is not required, but it increases object detection accuracy.
  • Set Depth Mode to ULTRA. This is also not required, but it improves depth accuracy.
  • If your camera is fixed and will not move, enable Tracking Is Static to prevent tracking drift throughout the scene.
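
These options live in the Inspector, but they can also be set from a script before the camera opens. The sketch below is an assumption-heavy illustration: the member names simply mirror the Inspector labels and may differ in your plugin version, so check ZEDManager.cs for the exact spellings.

```csharp
using UnityEngine;

// Hypothetical sketch: configure the ZED rig from code instead of the Inspector.
// Field names are assumptions based on the Inspector labels; verify them in ZEDManager.cs,
// and make sure this runs before ZEDManager opens the camera (e.g. via Script Execution Order).
public class RigSetupExample : MonoBehaviour
{
    public ZEDManager zedManager;   // drag the ZED_Rig_Mono's ZEDManager here

    void Awake()
    {
        zedManager.resolution = sl.RESOLUTION.HD1080;   // better detection accuracy
        zedManager.depthMode = sl.DEPTH_MODE.ULTRA;     // better depth quality
        zedManager.estimateInitialPosition = true;      // enables floor detection
        zedManager.trackingIsStatic = true;             // only if the camera never moves
    }
}
```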

Body Tracking Settings

Scroll down to the Object Detection / Body Tracking section.

Change the Object Detection Model to any HUMAN_BODY_XX detection model.

Multiple settings are available:

  • Image Sync: Synchronizes object detection with each image grab.

  • Enable Object Tracking: If enabled, the ZED SDK will track objects between frames, providing more accurate data and giving access to more information, such as velocity.

  • Enable Body Tracking: The fitting process uses the history of each tracked person to deduce any missing keypoints, thanks to the human kinematic constraints used by the Body Tracking module. It can also extract the local rotation between pairs of neighboring bones by solving the inverse kinematics problem.

  • Max Range: Defines the maximum depth range for detections (in meters).

  • Person Confidence Threshold: Sets the minimum confidence value for a detected person to be published. For example, if set to 40, the ZED SDK must be at least 40% confident that a detected person exists before it is reported.

Last is the “Start Object Detection” button. Normally, the Object Detection module doesn’t start when the ZED does, because loading it causes a long delay. This button is one of two ways to start the module. The other is via script, which is what we’ll be doing here (a sketch of that approach follows).
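
A rough sketch of that script approach is shown below; it is essentially what the viewer component set up in the next section does for you. The ZEDManager members it uses (objectDetectionModel, OnZEDReady, StartObjectDetection) are assumptions based on recent plugin versions, so verify them against ZEDManager.cs.

```csharp
using UnityEngine;

// Sketch: pick a body tracking model, then start the Object Detection /
// Body Tracking module once the camera itself is ready.
// Member names are assumptions; check ZEDManager.cs for your plugin version.
public class StartBodyTrackingExample : MonoBehaviour
{
    public ZEDManager zedManager;   // drag the ZED_Rig_Mono's ZEDManager here

    void Start()
    {
        // Any HUMAN_BODY_XX model works; FAST trades some accuracy for speed.
        zedManager.objectDetectionModel = sl.DETECTION_MODEL.HUMAN_BODY_FAST;

        // Loading the AI module is slow, so wait until the ZED is ready.
        zedManager.OnZEDReady += () => zedManager.StartObjectDetection();
    }
}
```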

Adding Visuals

  • Create a new empty GameObject in the Hierarchy and rename it “BodyTracking Viewer”
  • Add the ZEDSkeletonTrackingViewer component to it
  • In the Project window, go to ZED -> Examples -> SkeletonTracking -> Prefabs
  • Add the Virtual Canvas and the Virtual View Camera prefabs to your scene.
  • Drag the Virtual View Camera into the View Camera field of the ZEDSkeletonTrackingViewer in the Inspector window.

  • In the ZED_Rig_Mono prefab, drag the ZEDView texture into the Target Texture field of Camera_Left.

Note that “Start Object Detection Automatically” is checked by default. This will call the function in ZEDManager that initializes the Object Detection module as soon as the ZED 2 itself is ready.

Import your own 3D avatar

One 3D avatar is already available in the Skeleton Tracking example scene, but you can also import your own model. First, your 3D model needs to be rigged; otherwise it won’t be possible to animate it.

Then:

  • Import your model into Unity.
  • In the Rig tab, set the Animation Type to Humanoid and Apply.

  • Create a prefab of this model and add an Animator to it.
  • In the ZEDSkeletonTrackingViewer, drag your prefab into the Avatar Prefab field in the Inspector.
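
If the avatar stays frozen when you run the scene, a common cause is a model that was not imported as Humanoid. The optional check below uses only standard Unity APIs (Animator.isHuman, Animator.GetBoneTransform) to confirm that the prefab’s rig is valid; it is a debugging aid, not part of the plugin.

```csharp
using UnityEngine;

// Quick sanity check for an imported avatar: add it to the prefab (or any
// instance of it) and watch the Console when entering Play mode.
public class HumanoidRigCheck : MonoBehaviour
{
    void Start()
    {
        Animator animator = GetComponent<Animator>();
        if (animator == null || !animator.isHuman)
        {
            Debug.LogError("This model is not a Humanoid rig; set Animation Type " +
                           "to Humanoid in the Rig tab and click Apply.");
            return;
        }

        // Spot-check a few bones the body tracking needs to drive.
        foreach (HumanBodyBones bone in new[] { HumanBodyBones.Hips,
                                                HumanBodyBones.Spine,
                                                HumanBodyBones.Head })
        {
            if (animator.GetBoneTransform(bone) == null)
                Debug.LogWarning("Bone not mapped on this avatar: " + bone);
        }
    }
}
```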

Run the Scene

After a short initialization period, the app will pause for 10-20 seconds as the object detection module loads.

Once it does, step into the ZED’s view. You should see an avatar imitating your movements in real time.