Position Tracking in Unity

How to Add Position and Head Tracking in Unity for VR

Positional tracking is what makes the HTC Vive and Oculus Rift so immersive. By using the ZED as an add-on camera to a VR headset, desktop and mobile HMDs such as Gear VR can now track your position in space and offer great experiences with full freedom of movement.

In this tutorial, you will learn how to track the movement of the ZED in Unity so that identical virtual camera movement can be reproduced in the game engine.



Note: To bring ZED image, depth and motion tracking data into Unity, download the new 1.2 plugin here.

ZED Package for Unity

The ZED package for Unity includes assets, scripts, and a sample scene to assist with development. The ZED package contains the following subdirectories:

  • Prefab - Contains the main Unity prefab that is used to replace the regular Unity Camera within a scene: ZEDCamera.
  • Scenes - Contains a sample scene illustrating the use of ZED motion tracking in Unity: ZEDTrackingScene.
  • Scripts - Contains C# files that are used to interface the ZED and Unity components. The scripts are used by the Prefab.
  • Plugins - Contains sl_unitywrapper.dll, which enables C# scripts to communicate with the ZED SDK.

Download ZED Package for Unity

The package to use the ZED with Unity is available for download here.

To get started with virtual reality development in Unity, see Unity VR and Oculus documentation.

Create a new Project

  • Run Unity.
  • Select File > New.
  • Choose a Project name and a path.
  • Make sure 3D is selected and then click Create project.

Import the package

To import the ZED package into Unity, go to Assets > Import Package > Custom Package.


After selecting the ZEDCamera.unitypackage, make sure all the boxes are checked and then click Import.


You can now launch the ZEDTrackingScene sample. Make sure your ZED is connected, and that its line of sight is not obstructed. Run the demo scene, move your ZED around and watch the virtual camera reproduce the same movement in Unity.

Using ZED Prefab to move the virtual camera

The ZEDCamera prefab is the easiest way to add positional tracking in a virtual environment. To use, simply drag and drop the prefab into your scene.


By adding the ZEDCamera as a parent of Unity's main camera, you will have direct control of the virtual camera pose through head tracking. The starting position will be located at (0,0,0) in World space.


To change the starting position of the virtual camera, you need to add the ZEDCamera as a child of a parent GameObject. Camera movement will be relative to the parent GameObject's coordinate system.
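If you prefer to set up this offset from a script rather than in the Editor hierarchy, the same parenting can be sketched as follows. The script and object names here are hypothetical illustrations, not part of the ZED plugin:

```csharp
using UnityEngine;

//Hypothetical setup script: parents a ZEDCamera prefab instance to a
//reference GameObject so that tracking starts from an offset position.
public class TrackingOrigin : MonoBehaviour
{
    //Assign the ZEDCamera prefab in the Inspector
    public GameObject zedCameraPrefab;

    void Start()
    {
        //The parent GameObject defines the starting pose of the tracked camera
        GameObject origin = new GameObject("TrackingOrigin");
        origin.transform.position = new Vector3(0f, 1.7f, 0f); //e.g. eye height

        //All tracked movement becomes relative to the origin's coordinate system
        GameObject zed = Instantiate(zedCameraPrefab);
        zed.transform.SetParent(origin.transform, false);
    }
}
```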


Let’s add a few cubes to our scene. The virtual camera in Unity will now reproduce the same movement as the ZED.


Using ZED Prefab to move any GameObject

You can also move any object in the scene using the ZEDCamera. Fix the position of the virtual camera, then drag a cube object as a child of ZEDCamera. Move the ZED around and watch the cube reproduce your hand or head movements.

Adding positional tracking with ZEDTracker class

For advanced Unity users, this section explains how to interface Unity with the ZED tracker in C#. Example code is included in ZEDCamera.cs.

Initialize the ZED

First, you need to initialize the camera.

  //Get an instance of the camera
  ZED = ZEDTracker.GetInstance();

  //Create the camera (see ZEDCamera.cs for the exact arguments)
  ZED.CreateCamera();
When you initialize the ZED, you can choose which unit of measurement will be used.

  //Initialize the ZED and check for errors
  //(see ZEDCamera.cs for the initialization parameters, e.g. the unit)
  ZEDTracker.ERRCODE e = ZED.Init();
  if (e != ZEDTracker.ERRCODE.SUCCESS) {
    throw new ZEDException("Initialization failed " + e.ToString());
  }
Setup the tracker

To use the tracker you need to initialize it with a matrix. By default a 4×4 identity matrix is used.

  //Create a 4x4 identity matrix
  path = IdentityMatrix();
  //Enable tracking with the identity matrix
  //The tracking will relocalize itself if lost
  bool tracking = ZED.EnableTracking(path, true);
  if (!tracking) throw new ZEDException("Error, tracking not available");

After initializing the tracking, you need to grab a new image in the Update() method.
After each ZED.Grab(), you can retrieve the current position with ZED.GetPosition(path, ZEDTracker.MAT_TRACKING_TYPE.PATH).
It will fill an Rt (rotation and translation) array with the position and orientation of the camera in space.
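Put together, the grab-then-retrieve sequence forms a minimal Update() loop. This is only a sketch, assuming ZED and path were set up as shown above; the exact Grab() signature may vary between plugin versions:

```csharp
void Update()
{
    //Grab a new image and update the tracker's internal state
    ZED.Grab();

    //Retrieve the absolute camera position since tracking was enabled
    ZED.GetPosition(path, ZEDTracker.MAT_TRACKING_TYPE.PATH);
}
```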

  //Get the position of the camera as a path
  ZED.GetPosition(path, ZEDTracker.MAT_TRACKING_TYPE.PATH);
  //Fill the Rt matrix from the flat array
  for (int i = 0; i < 4; ++i) {
      for (int j = 0; j < 4; ++j) {
          Rt[i, j] = path[i * 4 + j];
      }
  }

The Rt transformation matrix represents both the rotation and translation of the ZED.
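To make the memory layout concrete, here is a self-contained illustration of the unpacking step with plain C# arrays (no Unity or plugin types; the values are made up): for an identity rotation, the fourth column of Rt holds the camera translation.

```csharp
using System;

class RtDemo
{
    static void Main()
    {
        //A row-major 4x4 matrix flattened into 16 floats, as GetPosition()
        //fills it: identity rotation plus a translation of (1, 2, 3)
        float[] path = {
            1, 0, 0, 1,
            0, 1, 0, 2,
            0, 0, 1, 3,
            0, 0, 0, 1
        };

        //Same unpacking as in the tutorial loop
        float[,] Rt = new float[4, 4];
        for (int i = 0; i < 4; ++i)
            for (int j = 0; j < 4; ++j)
                Rt[i, j] = path[i * 4 + j];

        //The fourth column is the camera translation
        Console.WriteLine($"t = ({Rt[0, 3]}, {Rt[1, 3]}, {Rt[2, 3]})"); //t = (1, 2, 3)
    }
}
```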


If you use ZEDTracker.MAT_TRACKING_TYPE.POSE, the orientation and translation values will be given as a displacement from the previous position, also called the “pose”.
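The relationship between the two modes can be illustrated with plain arrays: composing successive POSE displacements by matrix multiplication recovers the absolute PATH. A self-contained sketch (not the plugin API; pure translations only, for simplicity):

```csharp
using System;

class PoseDemo
{
    //Multiply two row-major 4x4 matrices stored as 16-element arrays
    static float[] Mul(float[] a, float[] b)
    {
        var r = new float[16];
        for (int i = 0; i < 4; ++i)
            for (int j = 0; j < 4; ++j)
                for (int k = 0; k < 4; ++k)
                    r[i * 4 + j] += a[i * 4 + k] * b[k * 4 + j];
        return r;
    }

    //Build a row-major 4x4 translation matrix
    static float[] Translation(float x, float y, float z)
    {
        return new float[] {
            1, 0, 0, x,
            0, 1, 0, y,
            0, 0, 1, z,
            0, 0, 0, 1
        };
    }

    static void Main()
    {
        //Two frame-to-frame displacements (POSE) of +1 m along x each...
        float[] pose1 = Translation(1, 0, 0);
        float[] pose2 = Translation(1, 0, 0);

        //...compose into the absolute position (PATH): +2 m along x
        float[] path = Mul(pose1, pose2);
        Console.WriteLine(path[3]); //2
    }
}
```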

  //Get the translation
  Vector4 t_ = Rt.GetColumn(3);
  Vector3 translation = new Vector3(t_.x, t_.y, t_.z);

  //Get the rotation as a Unity compatible quaternion
  Quaternion rotation = ZEDTracker.QuaternionFromMatrix(Rt);