Position Tracking in Unity

How to Add Position and Head Tracking in Unity for VR

Positional tracking is what makes the HTC Vive and Oculus Rift so immersive. By using the ZED as an add-on camera to a VR headset, desktop and mobile HMDs such as Gear VR can now track your position in space and offer great experiences with full freedom of movement.

In this tutorial you will learn how to track the movement of the ZED in Unity so that an identical virtual camera move can be reproduced in the game engine.



Note: To bring ZED image, depth and motion tracking data into Unity, download the new 1.2 plugin here.

ZED Package for Unity

The ZED package for Unity includes assets, scripts, and a sample scene to assist with development. The ZED package contains the following subdirectories:

  • Prefab: Contains the main Unity prefab that is used to replace the regular Unity Camera within a scene: ZEDCamera.
  • Scenes: Contains a sample scene illustrating the use of ZED motion tracking in Unity: ZEDTrackingScene.
  • Scripts: Contains C# files that are used to interface the ZED and Unity components. The scripts are used by the Prefab.
  • Plugins: Contains sl_unitywrapper.dll, which enables C# scripts to communicate with the ZED SDK.

Download ZED Package for Unity

The package to use the ZED with Unity is available for download here.

To get started with virtual reality development in Unity, see Unity VR and Oculus documentation.

Create a new Project

  • Run Unity.
  • Select File > New.
  • Choose a Project name and a path.
  • Make sure 3D is selected and then click Create project.

Import the package

To import the ZED package in Unity, go to Assets > Import Package > Custom Package.


After selecting the ZEDCamera.unitypackage, make sure all the boxes are checked and then click Import.


You can now launch the ZEDTrackingScene sample. Make sure your ZED is connected, and that its line of sight is not obstructed. Run the demo scene, move your ZED around and watch the virtual camera reproduce the same movement in Unity.

Using ZED Prefab to move the virtual camera

The ZEDCamera prefab is the easiest way to add positional tracking in a virtual environment. To use it, simply drag and drop the prefab into your scene.


By adding the ZEDCamera as a parent of the Unity main camera, head tracking will directly drive the virtual camera's pose. The starting position will be located at (0,0,0) in World space.


To change the starting position of the virtual camera, add the ZEDCamera as a child of a parent GameObject. Camera movement will then be relative to the parent GameObject's coordinate system.


Let’s add a few cubes to our scene. The virtual camera in Unity will now reproduce the same movement as the ZED.


Using ZED Prefab to move any GameObject

You can also move any object in the scene using ZEDCamera. Fix the position of the virtual camera, and drag a cube object as a child of ZEDCamera. Move the ZED around, and watch the cube reproduce your hand or head movements.

Adding positional tracking with ZEDTracker class

For advanced Unity users, this section explains how to interface Unity with the ZED tracker in C#. Example code is included in ZEDCamera.cs.

Initialize the ZED

First, you need to initialize the camera.

  //To get an instance of the camera
  ZED = ZEDTracker.GetInstance();

  //Create the camera

When you initialize the ZED, you can choose which metric will be used.

  //Initialize the ZED and return an error if a problem occurs
  //(e is the error code returned by the camera creation call)
  if (e != ZEDTracker.ERRCODE.SUCCESS) {
    throw new ZEDException("Initialization failed " + e.ToString());
  }

Setup the tracker

To use the tracker you need to initialize it with a matrix. By default a 4×4 identity matrix is used.

  //Create an identity matrix 4x4
  path = IdentityMatrix();
  //Enable tracking with the identity matrix
  //The tracking will relocalize itself if lost
  bool tracking = ZED.EnableTracking(path, true);
  if (!tracking) throw new ZEDException("Error, tracking not available");

After initializing the tracking, you need to grab the image in the Update() method.
With ZED.Grab(), you can retrieve a position at any given time using ZED.GetPosition(pos, ZEDTracker.MAT_TRACKING_TYPE.PATH).
It will fill an Rt (rotation and translation) array with the position and orientation of the camera in space.

  //Get the position of the camera as a path
  ZED.GetPosition(path, ZEDTracker.MAT_TRACKING_TYPE.PATH);
  //Fill the Rt array
  for (int i = 0; i < 4; ++i) {
      for (int j = 0; j < 4; ++j) {
          Rt[i, j] = path[i * 4 + j];
      }
  }

The Rt transformation matrix represents both the rotation and translation of the ZED.


If you use ZEDTracker.MAT_TRACKING_TYPE.POSE, the orientation and translation values will be given as a displacement from the previous position, also called the “pose”.

  //Get the translation
  Vector4 t_ = Rt.GetColumn(3);
  Vector3 translation = new Vector3(t_.x, t_.y, t_.z);

  //Get the rotation as a Unity compatible quaternion
  Quaternion rotation = ZEDTracker.QuaternionFromMatrix(Rt);

Read More


ZED 1.0 is here, adds Positional Tracking and 3D Reconstruction

We are excited to announce the release of ZED SDK 1.0! This update brings 6-DoF positional tracking to VR and robotics, real-time 3D mapping with ZEDfu, a new Wide VGA resolution and more.

The Road to 1.0

It’s been an exciting year here at Stereolabs. Since the launch of the camera in 2015, the ZED community has been growing far beyond our expectations. The camera has been used in a variety of autonomous cars, drones, and robots, in the visual effects and movie industry and even in retail and sports applications. We received great feedback from thousands of developers all around the world and worked hard to make continuous improvements to the ZED.

In the meantime, our research team here at Stereolabs has been working relentlessly to prepare this groundbreaking update. 1.0 is the result of months of work from many computer vision engineers and scientists. It pushes the boundaries of what’s possible with stereo vision, and makes the ZED the first device in the industry to offer real-time depth sensing, positional tracking and 3D mapping capabilities.

We are excited to see what new applications you will create with the ZED. So read on to learn what’s new in v1.0, and download the latest SDK here.

Positional Tracking for VR and Robotics

ZED 1.0 introduces position and orientation tracking for any device that has a ZED mounted on it. This opens the doors to creating incredible mobile and desktop VR experiences with total freedom of movement. In other fields such as robotics, this feature can be used to help machines keep track of their own location as they move around the space without the use of GPS or high-end IMU.

To perform tracking, the ZED uses a novel depth-based SLAM (Simultaneous Localization and Mapping) technology that was developed from scratch and optimized to run at high speed. The camera maps the three-dimensional world in front of it in real time and understands how the user moves through space. On desktop and laptop GPUs, tracking runs at camera frame-rate, which means you can get up to 100Hz tracking frequency in WVGA mode.

Positional tracking API is available in ZED SDK 1.0 for Windows, Linux, and Jetson. The tracking API also supports Unity, ROS and other third-party libraries. Check out our samples on GitHub and get started.

ZEDfu for Real-time 3D Mapping

Photogrammetry is great, but it is time-consuming and requires a lot of manual steps. To simplify the process of capturing and editing 3D models, we have created ZEDfu, an easy-to-use 3D scanning application that captures 3D models of a scene in real time.

Using ZEDfu, users can pick up and move a ZED around to generate a 3D mesh of any indoor or outdoor environment. The resulting mesh can be imported into 3D software like Blender, Maya or MeshLab for further editing and finishing. And for those of you who wonder, yes, ZEDfu means ZED fusion!

ZEDfu is available as a standalone application on Windows. To get it, download ZED SDK here.

High-speed WVGA mode

High-speed VGA is a popular choice among ZED users. We have updated the ZED firmware (1142) and introduced a new Wide VGA mode at 100 FPS that replaces the previous VGA mode. WVGA resolution is increased to 672*376 per image and offers improved image quality along with a much wider field of view. Launch the ZED Explorer to update your firmware.

Performance improvements and SDK changes

The 1.0 full update list includes the following functionalities and improvements:

  • Positional tracking API: New tracking API lets you localize the camera in space.
  • ZEDfu application for capturing real-time 3D models.
  • Wide VGA mode: The new high-speed mode brings improved image quality and a wider field of view at 100FPS.
  • SVO compression: New SVO lossless compression now saves 66% in file size and processing time.
  • New depth sensing mode with up to 2x faster computation: Updated PERFORMANCE mode to a new faster than real-time stereo matching algorithm with up to 2x speedup on Windows, Linux and Jetson embedded boards.
  • Manual control for camera exposure time: Exposure time can now be configured manually.
  • Unity support.
  • OpenCV 3.1 support.

We’re very excited to share this release with you, and we’re eager to see what applications you will create. Download ZED SDK 1.0 to get started, and don’t forget to send us feedback and updates about your work! And follow us on Twitter for news and updates.

 – The ZED team

Read More


ZED SDK 0.9.3 Brings Performance Improvements and More

We are excited to announce the release of ZED SDK version 0.9.3! This new update of the ZED SDK brings significant performance and usability improvements, along with support for our new factory calibration technology.

The 0.9.3 release includes the following improvements:

  • Significant performance improvements on PC, and on Tegra K1 and X1 mobile/embedded systems-on-a-chip
  • General usability improvements, including a background process for self-calibration
  • Support for CUDA 7.5
  • Support for ZED new factory calibration technology
  • Improved depth map filtering in saturated areas
  • Many improvements to the ZED tools and samples

New Architecture and Higher Performance

With the number of ZED users growing significantly in the past few months, we’ve been expanding the development team to prepare the release of significant software updates in 2016. We started the year with a brand-new architecture for the ZED SDK that introduces great performance and stability improvements and allows us to push toward shorter release cycles.

The 0.9.3 release brings up to 40% faster depth computation on lower-performance graphics chips, including embedded systems-on-a-chip such as the Tegra K1 or X1. On these platforms, latency has been dramatically reduced when using CPU retrieve functions. These improvements make the ZED and Tegra one of the most powerful platforms for developing embedded computer vision applications.

General Usability Improvements

The new update brings several usability improvements. The initialization of the camera, which includes self-calibration, is now run as a non-blocking background process. This makes using the ZED much easier, as self-calibration no longer needs to succeed before the camera can be used. If self-calibration is not successful, the SDK will fall back to the default factory calibration.

The user interface for recording and playing back SVOs and snapshots has also been improved, and the ZED Explorer and Settings apps have been merged. The ZED Explorer is now the app to use when you want to adjust your camera controls and calibration settings.

Finally, we’ve also reduced depth flicker in saturated areas by detecting and filtering those areas.

Support for CUDA 7.5

ZED SDK 0.9.3 brings full support for CUDA 7.5 on Windows and Linux, which will simplify the SDK integration in applications using the latest version of CUDA.

New Factory Calibration

Accurate camera calibration is one of the biggest challenges in building a long-range depth sensor. It is very difficult to calibrate a 3D camera with a high accuracy in both the near and far range and in the corners of the picture. At Stereolabs, we’ve been working on improving our calibration process since early 2015, and we now have a brand new stereo calibration process set up at our factory.

This new process brings great improvements in depth accuracy in the far range and in the corners of the image. You can also expect consistent behavior across all ZED cameras. Cameras shipped after February 1st, 2016 will benefit from this new calibration process. For our other users, we will release a recalibration app with the next update that will let you reach similar results with your ZED.

Download the ZED SDK 0.9.3 Release Today!

ZED SDK 0.9.3 for Windows and Linux is now available for download on our Developer page. Our team is very excited to bring this release as a stepping-stone for the work we have set out for 2016!


Read More


ZED for FIRST Robotics competition


Welcome to FIRST Stronghold participants!

At the Kickoff, teams were shown the FIRST STRONGHOLD game field and challenge details for the first time. Now it’s time to design and build the robot that will dominate the field!

Add the power of human vision to your robot

While the game seems to be quite complex this year, you can count on Stereolabs and the ZED camera to help you.

We’ll be offering a special ZED educational price for teams competing in FRC, so that you can add 3D vision to your robot. If you’re interested, send us an email to support@stereolabs.com with a short description of your team and we’ll give you a discount code.

And make sure to check NVIDIA FIRST Robotics dedicated page. They are offering the NVIDIA Tegra X1 for just $299 during the competition.

May the ZED be with you

To help you get up and running, we published sample code on our GitHub page. Also, we’re here to help, so feel free to contact our tech support team if you need anything.

We’re very excited to see what you and your teams will build! Good luck!

– The ZED team



New ZED SDK 0.9.2 Brings Support for Oculus Rift

We’re excited to announce the release of the ZED SDK 0.9.2 for Windows and Linux. This release adds support for Oculus Rift, substantial performance improvements for NVIDIA Jetson TK1, and much more.

Support for Oculus Rift

As part of 0.9.2, we have introduced support for Oculus Rift DK2. You can now use the ZED to live stream 3D video to Oculus Rift, either for AR / VR applications or 3D FPV. A complete tutorial is available on our blog, along with source code.  Also, don’t forget to regularly check our GitHub page for updates!


Higher Performance with Jetson TK1

0.9.2 includes a brand-new release of the ZED SDK for Jetson TK1. Specifically optimized for the ARM platform, the ZED depth computation engine includes substantial performance improvements. Depth is now 40% faster, so make sure to update to the latest ZED SDK for Jetson.


Better Image Quality

While 0.9 introduced major changes in the SDK architecture, it impacted image quality. In 0.9.2, we have dramatically reduced aliasing artifacts, so if you’re using 0.9, we strongly suggest you upgrade to the new ZED SDK!

New Advanced Calibration Features

We’ve received many requests for advanced customization options from our expert users. In this update, we introduce the ability to calibrate your ZED and input your own calibration parameters in the Settings control panel. This allows you to have full control over your camera parameters while still benefiting from the ZED SDK's advanced features.

Windows 10 Compatibility

Lastly, now that Windows 10 is rolling out, we’ve added full support for the new OS in 0.9.2.


The Road Ahead

If you’re following us on Twitter, you should have seen that we’re working in close collaboration with NVIDIA on its next-generation Jetson TX1 platform. Check out this video to see how we combined the ZED, NVIDIA Jetson TX1 and a DJI Matrice 100 to build the first live scanning drone!

Also, we’re preparing a new release of the ZED SDK for the NVIDIA Tegra X1, along with some very cool things that we will release after CES 2016.

We see more and more developers use the ZED, so don’t forget to send us feedback and updates about your work! And follow us on Twitter for news and updates.

– The ZED team



New ZED SDK 0.9 Adds Jetson TK1 and ROS Support

ZED SDK 0.9 for Windows and Linux is now available for download on our Developer page. The release is a major update to the ZED SDK, bringing with it support for NVIDIA Jetson TK1, ROS and CUDA 7.

0.9 also includes new tools, samples and documentation that simplify the development of applications with the ZED stereo camera, along with various performance improvements as well as a few fixes.

New Tools and Samples

As part of 0.9, we have introduced a new Depth Viewer tool and new samples including ROS, CUDA, Recording and Playback, along with a complete refactoring of existing samples to make them more accessible and easy to use.

Jetson TK1 Support

The ZED SDK for Jetson TK1 has been released! While the SDK is still in Alpha, most of the functionality offered in the ZED SDK for Linux is available. We are still working closely with NVIDIA to unlock higher resolutions and frame rates, which is a challenge considering that the ZED outputs 3840*1080 video at 30 FPS over a single USB 3.0 interface!

ROS Compatibility

We heard you! We had many requests for ROS support, so we created a wrapper that is now available in the Linux SDK. You can use your ZED either as a stereo camera sending left and right images, or as a depth sensor sending both color images and depth.

The wrapper has been designed for the current stable ROS version (Indigo) as a catkin package, and should also work with other versions of ROS with minimal or no modifications.

Read more about the new ROS wrapper here: Using ZED camera with ROS

CUDA 7.0 Support

The new ZED SDK 0.9 for Windows now supports CUDA 7.0. Since CUDA 7.0 no longer supports building 32-bit applications, Win32 support is now only available with the CUDA 6.5 version (available here) and is deprecated.

In future SDK releases, the ZED SDK for Linux will keep using CUDA 6.5 in order to maintain interoperability with the Jetson TK1. On Windows, support for CUDA 6.5 and 32-bit builds will be dropped.

The Road Ahead

We’re very excited about this update and want to thank you all for your feedback and participation in helping us improve the ZED. But the work doesn’t stop here. We’ve got some really exciting stuff in the works to continue adding to and expanding the use of the ZED and stereo vision around the world.

Thanks so much for all of your support and feedback, and stay tuned!

– The ZED team




Using the ZED Camera with ROS

The ROS wrapper is an interface between the ZED SDK and the ROS framework. This wrapper lets you access ZED stereo images, depth map, 3D point cloud and 6-DoF motion tracking in the ROS environment.



The wrapper is a catkin package that publishes ZED measurements such as depth and odometry on ROS topics. You will need the following ROS dependencies:

  • tf2_ros
  • nav_msgs
  • roscpp
  • rosconsole
  • sensor_msgs
  • opencv
  • image_transport
  • dynamic_reconfigure
  • urdf

You also need to install the ZED SDK with CUDA and OpenCV, as described on our Getting Started page.

Build the application

The latest ROS wrapper can be found on our GitHub page. Download and extract the contents of the .zip file. Once extracted, rename the folder to zed-ros-wrapper and copy it into the catkin workspace source directory ~/catkin_ws/src. If you haven’t created your workspace yet, follow this short tutorial on the ROS wiki.

Now you just need to compile the wrapper from your catkin workspace root ~/catkin_ws.

To do so, open a terminal and execute the following commands:

cd ~/catkin_ws
catkin_make
source ./devel/setup.bash
Launch the application

To run the program, you need to use a launch file, which contains parameters such as the camera resolution or the depth map mode.

Open a terminal and execute the following command:

roslaunch zed_wrapper zed.launch

The wrapper is now running and the ZED camera outputs are now accessible on the published ROS topics.

ZED ROS topics

A topic is a bus over which data is exchanged or published. For example, you can access ZED left image data on the /zed/left/image_rect_color topic.

Here is the full list of published topics:

Left camera

  • /zed/rgb/image_rect_color : Color rectified image (left RGB image by default).
  • /zed/rgb/image_raw_color : Color unrectified image (left RGB image by default).
  • /zed/rgb/camera_info : Camera calibration data.
  • /zed/left/image_rect_color : Color rectified left image.
  • /zed/left/image_raw_color : Color unrectified left image.
  • /zed/left/camera_info : Left camera calibration data.

Right camera

  • /zed/right/image_rect_color : Color rectified right image.
  • /zed/right/image_raw_color : Color unrectified right image.
  • /zed/right/camera_info : Right camera calibration data.

Depth and point cloud

  • /zed/depth/depth_registered : Depth map image registered on left image (by default 32 bits float, in meters).
  • /zed/point_cloud/cloud_registered : Registered color point cloud.

Visual odometry

  • /zed/odom : Absolute 3D position and orientation relative to zed_initial_frame.

Display images and depth

You can easily display ZED images and depth using rviz or rqt_image_view. Just select a topic in the rqt GUI to display it in the main window.


Display point cloud

To display the point cloud, launch the rviz visualizer with the following command:

rosrun rviz rviz

Now move your mouse to the top left of the rviz window and select zed_initial_frame in Displays->Global Options->Fixed Frame.
Click on add (bottom left), choose the ‘By Topic’ tab, and select point_cloud->cloud->PointCloud2.

ROS zed_wrapper demo

Display odometry

To visualize the odometry in rviz, click on the Add button (bottom left), and select the ZED odom topic under the By topic tab. To display odometry data correctly, make sure to select the newly created Odometry object in the left list, set Position tolerance and Angle Tolerance to 0, and Keep to 1.

Odometry preview

Using multiple ZED with ROS

It is possible to use multiple ZED cameras with ROS. Simply launch zed_multi_cam using the roslaunch command:

roslaunch zed_wrapper zed_multi_cam.launch

Note: The ZED camera uses the full USB 3.0 bandwidth to output video. When using multiple ZEDs, you may need to reduce camera framerate and resolution to avoid corrupted frames (green or purple frames). You can also use multiple GPUs to load-balance computations and improve performance.

For more information, read the ZED ROS wiki page.



How to Build an App with the ZED SDK


The goal of this tutorial is to show you how to build applications using the ZED SDK.

All our programs are developed in C++, so working knowledge of C++ programming and of the C++ compilation pipeline (command line or IDE) is required for this tutorial.


First of all, make sure that your environment is set up correctly, as explained here.
Note: On Windows, please use Visual Studio 2013. Visual Studio 2015 is not supported by CUDA 6.5 and 7.0.

Then start coding your first program using the ZED camera. Here is a very simple “Hello World” program that just initializes the ZED camera in HD resolution:


//standard includes
#include <iostream>
//ZED includes
#include <zed/Camera.hpp>

int main(int argc, char** argv) {
      sl::zed::Camera* zed = new sl::zed::Camera(sl::zed::HD1080);
      zed->init(sl::zed::MODE::QUALITY, -1, true);
      std::cout << "Hello World! My ZED works!" << std::endl;
      delete zed;
      return 0;
}

Let’s see how to compile this!

Cross-platform method using CMake

This compilation method works on every platform supported by the ZED SDK and provides an easy way to generate a project for every major IDE.

You must have CMake 2.4 or later installed on your system.

1. Folders and files

Create a folder where you want to put your project. Create a sub-folder named src/. In this folder, add the file main.cpp given above.

Then, in the project folder, create a file named CMakeLists.txt:



cmake_minimum_required(VERSION 2.4)
PROJECT(ZED_PROJECT)

if(COMMAND cmake_policy)
	cmake_policy(SET CMP0003 OLD)
	cmake_policy(SET CMP0015 OLD)
endif(COMMAND cmake_policy)

SET(EXECUTABLE_OUTPUT_PATH ".")

IF(WIN32) # Windows
	SET(ZED_INCLUDE_DIRS $ENV{ZED_INCLUDE_DIRS})
	if (CMAKE_CL_64) # 64 bits
		SET(ZED_LIBRARIES $ENV{ZED_LIBRARIES_64})
	else(CMAKE_CL_64) # 32 bits
		message("32bits compilation is no more available with CUDA7.0")
	endif(CMAKE_CL_64)
	find_package(CUDA 7.0 REQUIRED)
ELSE() # Linux
	find_package(ZED REQUIRED)
	find_package(CUDA 6.5 REQUIRED)
ENDIF(WIN32)

find_package(OpenCV 2.4 COMPONENTS core highgui imgproc REQUIRED)

include_directories(${ZED_INCLUDE_DIRS} ${CUDA_INCLUDE_DIRS} ${OpenCV_INCLUDE_DIRS})
link_directories(${ZED_LIBRARY_DIR} ${CUDA_LIBRARY_DIRS} ${OpenCV_LIBRARY_DIRS})

FILE(GLOB_RECURSE SRC_FILES src/*.cpp)
ADD_EXECUTABLE(ZED_PROJECT ${SRC_FILES})

add_definitions(-std=c++0x) # -m64)

TARGET_LINK_LIBRARIES(ZED_PROJECT ${ZED_LIBRARIES}
                        ${CUDA_LIBRARIES} ${CUDA_npps_LIBRARY} ${CUDA_nppi_LIBRARY}
                        ${OpenCV_LIBRARIES})

SET(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -g -O3" ) # Release Perf mode

This file is used by CMake as a description of the project to generate. Using that file, CMake is able to generate the following files:

  • a Makefile;
  • an IDE project file;
  • or both (for IDEs supporting Makefiles).

Note: The folder containing the cpp source files is given in the CMakeLists.txt file, in our case ‘src’. A standard cpp project compiled with CMake typically contains a ‘src’ and an ‘include’ folder for the source code and the header files. The CMakeLists.txt is often in the root folder of the project, and the binary generation is done in a build folder. This is for good practice and clarity, but nothing prevents you from putting everything in the same folder, as long as you update the source folder in the CMakeLists.txt.
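The layout described in the note looks like this (folder names from this tutorial; main.cpp is the file given above):

```
project/
├── CMakeLists.txt
├── build/          (generated by CMake)
└── src/
    └── main.cpp
```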

To get more information about how to write a CMakeLists.txt file, check the last section of this tutorial.

2. Generate project on Windows

Open cmake-gui.

In “Where is the source code“, enter the path of your project folder, i.e. where the CMakeLists.txt file is.

In “Where to build the binaries“, enter the previous path and add: /build.

Click the “Configure” button.

A dialog window asks you if CMake can create the folder “build” itself. Say yes.

Then another dialog window will ask you for the project generator. Choose Visual Studio; in our example, we chose “Visual Studio 12 2013 Win64“. Click the “Finish” button.

CMake may take a few seconds to configure the project. Then, some red lines should be displayed in the cmake-gui window.

Click the “Generate” button.

CMake has just generated your project in the build folder. Now, you can close the cmake-gui window and go to the build folder.

Visual Studio project files have been generated, including a solution file named “Project.sln”. Open it with Visual Studio. You can now edit and compile your program within the Visual Studio IDE.

3. Generate project on Linux

Open your terminal, and move your prompt to your project’s folder.

$ cd path/to/your/project/folder

Create a build folder and change directory.

$ mkdir build
$ cd build

Ask CMake to generate your project with the location of the CMakeLists.txt (one folder above).

$ cmake ..

You should get the following outputs:

-- The C compiler identification is GNU 4.8.4
-- The CXX compiler identification is GNU 4.8.4
-- Check for working C compiler: /usr/bin/cc
-- Check for working C compiler: /usr/bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/c++
-- Check for working CXX compiler: /usr/bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Found CUDA: /usr/local/cuda (found suitable exact version "6.5")
-- Configuring done
-- Generating done
-- Build files have been written to: /path/to/your/project/folder/build

If you list the folder’s contents, you should see the following output:

$ ls
CMakeCache.txt  CMakeFiles  cmake_install.cmake Makefile

Now, you can edit and change your code as you wish. Then to compile your program, enter the following command:

$ make

Finally, to test the program:

$ ./ZED_PROJECT
You should get something like:

ZED SDK >> (Init) Best GPU Found : GeForce GTX 770, ID : 0 
ZED SDK >> (Init) Quality mode has been set to Quality mode
ZED SDK >> (Init) Creating ZED GPU mem...
ZED SDK >> (Init) Trying self calibration... 
ZED SDK >> (Init) Done...
Hello World! My ZED works!

4. CMakeLists.txt : writing your own cmake program

Let’s look inside the CMakeLists.txt to see how it works.

Setting the project attributes

At the beginning of the CMakeLists.txt file, you can find the following lines:

cmake_minimum_required(VERSION 2.4)
PROJECT(ZED_PROJECT)

if(COMMAND cmake_policy)
    cmake_policy(SET CMP0003 OLD)
    cmake_policy(SET CMP0015 OLD)
endif(COMMAND cmake_policy)

The project is named ZED_PROJECT and the minimum required version of CMake is 2.4.

The path where the executable of your program must be generated is then set, using this command:

SET(EXECUTABLE_OUTPUT_PATH ".")
Setting the path of dependencies

Now, the dependencies of your program must be set, with the specific location for each system (Windows and Linux). Here, you need the ZED libraries and headers, but also the CUDA and OpenCV headers.

Please note that on Windows, the ZED SDK installer sets environment variables during installation. Therefore, you can use $ENV{…} with the ZED_INCLUDE_DIRS and ZED_LIBRARIES_32/64 variables to get the include and library paths of the ZED SDK.

On Linux, we also provide a zed-config.cmake that automatically finds the include and library folders of the ZED SDK through find_package(…).

IF(WIN32) # Windows
	SET(ZED_INCLUDE_DIRS $ENV{ZED_INCLUDE_DIRS})
	if (CMAKE_CL_64) # 64 bits
		SET(ZED_LIBRARIES $ENV{ZED_LIBRARIES_64})
	else(CMAKE_CL_64) # 32 bits
		message("32bits compilation is no more available with CUDA7.0")
	endif(CMAKE_CL_64)
	find_package(CUDA 7.0 REQUIRED)
ELSE() # Linux
	find_package(ZED REQUIRED)
	find_package(CUDA 6.5 REQUIRED)
ENDIF(WIN32)

find_package(OpenCV 2.4 COMPONENTS core highgui imgproc REQUIRED)

Linking the dependencies to CMAKE

Now that your environment contains the paths, you must tell CMake what the include directories (headers) and link directories (libs) are:

include_directories(${ZED_INCLUDE_DIRS} ${CUDA_INCLUDE_DIRS} ${OpenCV_INCLUDE_DIRS})
link_directories(${ZED_LIBRARY_DIR} ${CUDA_LIBRARY_DIRS} ${OpenCV_LIBRARY_DIRS})

The dependencies are now known to CMake.

Setting the path of the source code

Now, you must tell CMake where to find the source code to compile. In this tutorial, the source code is in the sub-folder src/:

FILE(GLOB_RECURSE SRC_FILES src/*.cpp)
ADD_EXECUTABLE(ZED_PROJECT ${SRC_FILES})

The first line gathers every cpp file of the src/ directory into SRC_FILES. The second line specifies which files to add to the executable; in this case we generate a single program from all the source files.

Now your CMakeLists.txt file is almost set. The next step is the last one ;).

Linking the libraries during compiling process

Indeed, you just have to tell CMake to link the libraries during the compiling process:

# Add the required libraries for linking:
TARGET_LINK_LIBRARIES(ZED_PROJECT ${ZED_LIBRARIES}
                      ${CUDA_LIBRARIES} ${CUDA_npps_LIBRARY} ${CUDA_nppi_LIBRARY}
                      ${OpenCV_LIBRARIES})

ZED libraries use C++11, so you must tell your compiler:

add_definitions(-std=c++0x)
File ready

Now, your project is ready to be generated.

Note: you can add compile flags for better performance:

SET(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -g -O3" ) # Release Perf mode



Using the ZED Stereo Camera with Matlab


This tutorial will explain how to use the ZED 3D camera with Matlab. You will learn how to capture images from the ZED and adjust camera parameters in Matlab. Note that viewing and manipulating depth map data is not covered in this tutorial.

Note: when using the ZED directly as a webcam, the images are not rectified.

For capturing rectified images and depth maps in Matlab with the ZED, please refer to “How to use the ZED SDK with Matlab”.


You must have a (free) MathWorks account and Matlab installed on your system. Of course, the ZED SDK has to be installed as well. Even if you’re new to Matlab, this tutorial should be easy to follow.

Getting started

Open Matlab and in the Command Window, enter this command:

>> webcamlist

You should encounter an error if you’ve never used Matlab with a webcam before:

Error using webcamlist (line 20)

MATLAB Support Package for Webcams has not been installed. Open Support Package Installer to install the Webcam Support Package.

If you’ve encountered the error, click the link in the error log. A dialog window appears:


Click Next > and Log In. Matlab will ask for your MathWorks account credentials. Log in.

Then, read and accept the MathWorks auxiliary software license agreement. Click Next > and Install. This might take a while (around 10-15 seconds).

When the setup process is finished, uncheck the “Show support package examples” check box and click on Finish.

Now, in the Matlab Command Window, re-enter:

>> webcamlist

and that should give you:

ans ='ZED'

Perfect! The ZED camera is properly detected. 🙂

Now grab an instance of the ZED camera with the following command line:

>> cam = webcam

and the cam variable will return the current parameters of the camera:

cam =

webcam with properties:

Name: 'ZED'
Resolution: '2560x720'
AvailableResolutions: {1x4 cell}
WhiteBalanceMode: 'auto'
Sharpness: 4
Saturation: 5
Hue: 0
Gain: 4
WhiteBalance: 4600
Contrast: 4
Brightness: 4
Exposure: 2
ExposureMode: 'auto'

Note that you can check whether the ZED camera works properly by using this command:

>> preview(cam)

You can now grab ZED frames with:

>> img = snapshot(cam);

You’ve just grabbed your first ZED frame in Matlab. If you need real-time video capture, set up a loop. Fairly straightforward, isn’t it? 😛

The ZED Camera will stay active as long as you keep the cam variable in Matlab’s environment. To turn off the ZED camera, use this command:

>> clear cam

Here is a snippet that demonstrates how to open the ZED, grab the side-by-side image and split it:

clear all; close all; clc;
% get access to the ZED camera
zed = webcam('ZED')
% set the desired resolution
zed.Resolution = zed.AvailableResolutions{1};
% get the image size
[height, width, channels] = size(snapshot(zed))
% create a figure and wait for a keyboard interruption to quit
f = figure('keypressfcn', 'close', 'windowstyle', 'modal');
ok = 1;
% loop over frames
while ok
    % capture the current image
    img = snapshot(zed);
    % split the side-by-side image into two images
    im_Left = img(:, 1 : width/2, :);
    im_Right = img(:, width/2 + 1 : width, :);
    % display the left and right images
    subplot(1, 2, 1);
    imshow(im_Left);
    title('Image Left');
    subplot(1, 2, 2);
    imshow(im_Right);
    title('Image Right');
    drawnow; % this checks for interrupts
    ok = ishandle(f); % does the figure still exist?
end
% close the camera instance
clear zed

