Using the ZED Camera with ROS

Note: This is for ZED SDK 1.2 only. Please see the latest SDK guide for ROS here.

The ROS wrapper is an interface between the ZED SDK and the ROS framework. This wrapper lets you access ZED stereo images, depth maps, 3D point clouds and 6-DoF motion tracking in the ROS environment.



The wrapper is a catkin package that publishes ZED measurements, such as depth and odometry, on ROS topics. You will need the following ROS dependencies:

  • tf2_ros
  • nav_msgs
  • roscpp
  • rosconsole
  • sensor_msgs
  • opencv
  • image_transport
  • dynamic_reconfigure
  • urdf

You also need to install the ZED SDK with CUDA and OpenCV, as described on our Getting Started page.

Build the application

The latest ROS wrapper can be found on our Github page. Download and extract the content of the .zip file. Once extracted, rename the folder to zed-ros-wrapper and copy it into the catkin workspace source directory ~/catkin_ws/src. If you haven’t created your workspace yet, follow this short tutorial on the ROS wiki.

Now you just need to compile the wrapper from your catkin workspace root directory ~/catkin_ws.

To do so, open a terminal and execute the following commands:

cd ~/catkin_ws
catkin_make
source ./devel/setup.bash

Launch the application

To run the program, you need to use a launch file, which contains parameters such as the camera resolution and depth map mode.

Open a terminal and execute the following command:

roslaunch zed_wrapper zed.launch

The wrapper is now running and the ZED camera outputs are now accessible on the published ROS topics.

ZED ROS topics

A topic is a named bus over which nodes exchange messages. For example, you can access the ZED left image data on the /zed/left/image_rect_color topic.
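
As a sketch, here is how a minimal rospy node might consume that topic. The topic name comes from the list below; frame_bytes() is an illustrative helper of ours, not part of the wrapper, and rospy is imported inside main() so the file only needs a ROS environment when the node actually runs:

```python
# Minimal subscriber sketch for the left rectified image topic.
# frame_bytes() is an illustrative helper (not part of the wrapper).

def frame_bytes(height, step):
    """Bytes occupied by one sensor_msgs/Image frame (step = bytes per row)."""
    return height * step

def main():
    # rospy and sensor_msgs are available once ROS is sourced
    import rospy
    from sensor_msgs.msg import Image

    def on_image(msg):
        # Log the geometry of each incoming left image
        rospy.loginfo("left image: %dx%d, %d bytes",
                      msg.width, msg.height,
                      frame_bytes(msg.height, msg.step))

    rospy.init_node("zed_left_listener")
    rospy.Subscriber("/zed/left/image_rect_color", Image, on_image)
    rospy.spin()
```

Call main() from a sourced ROS environment while the wrapper is running (i.e. after roslaunch zed_wrapper zed.launch).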

Here is the full list of published topics:

Left camera

  • /zed/rgb/image_rect_color : Color rectified image (left RGB image by default).
  • /zed/rgb/image_raw_color : Color unrectified image (left RGB image by default).
  • /zed/rgb/camera_info : Camera calibration data.
  • /zed/left/image_rect_color : Color rectified left image.
  • /zed/left/image_raw_color : Color unrectified left image.
  • /zed/left/camera_info : Left camera calibration data.

Right camera

  • /zed/right/image_rect_color : Color rectified right image.
  • /zed/right/image_raw_color : Color unrectified right image.
  • /zed/right/camera_info : Right camera calibration data.

Depth and point cloud

  • /zed/depth/depth_registered : Depth map image registered on the left image (32-bit float, in meters, by default).
  • /zed/point_cloud/cloud_registered : Registered color point cloud.
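
The depth topic publishes a standard sensor_msgs/Image whose data field is a flat byte buffer. As a sketch (depth_at() is an illustrative helper of ours, not part of the wrapper), reading the depth at a pixel boils down to unpacking a float32 at the right offset:

```python
import struct

def depth_at(data, row_step, u, v, big_endian=False):
    """Depth in meters at pixel (u, v) of a 32-bit float depth image
    buffer, as published on /zed/depth/depth_registered. `data` and
    `row_step` mirror the sensor_msgs/Image `data` and `step` fields."""
    fmt = ">f" if big_endian else "<f"
    offset = v * row_step + u * 4  # 4 bytes per float32 pixel
    return struct.unpack_from(fmt, data, offset)[0]

# Hypothetical 2x2 depth image (row-major): 1.0 1.5 / 2.0 2.5 meters
buf = struct.pack("<4f", 1.0, 1.5, 2.0, 2.5)
print(depth_at(buf, row_step=8, u=1, v=1))  # -> 2.5
```

In a real callback, pass msg.data, msg.step and check msg.is_bigendian; invalid depths come through as NaN or infinity, so test the result with math.isfinite() before using it.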

Visual odometry

  • /zed/odom : Absolute 3D position and orientation relative to zed_initial_frame.
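
The orientation in /zed/odom is expressed as a quaternion; for planar robots the quantity you usually want is the yaw angle. A small helper using the standard quaternion-to-yaw formula (the function name is ours, not part of the wrapper):

```python
import math

def yaw_from_quaternion(x, y, z, w):
    """Extract the yaw (rotation about Z, in radians) from the
    orientation quaternion of a nav_msgs/Odometry message."""
    siny_cosp = 2.0 * (w * z + x * y)
    cosy_cosp = 1.0 - 2.0 * (y * y + z * z)
    return math.atan2(siny_cosp, cosy_cosp)

# Identity quaternion -> no rotation
print(yaw_from_quaternion(0.0, 0.0, 0.0, 1.0))  # 0.0
# 90 degrees about Z: z = sin(45 deg), w = cos(45 deg)
print(round(math.degrees(yaw_from_quaternion(
    0.0, 0.0, math.sin(math.pi / 4), math.cos(math.pi / 4))), 3))  # 90.0
```

In a subscriber callback, the quaternion fields live at msg.pose.pose.orientation.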

Display images and depth

You can easily display ZED images and depth using rviz or rqt_image_view. Just select a topic in the rqt GUI to display it in the main window.


Display point cloud

To display the point cloud, launch the rviz visualizer with the following command:

rosrun rviz rviz

Now move your mouse to the top left of the rviz window and select zed_initial_frame in Displays->Global Options->Fixed Frame.
Click on add (bottom left), choose the ‘By Topic’ tab, and select point_cloud->cloud->PointCloud2.

ROS zed_wrapper demo

Display odometry

To visualize the odometry in rviz, click on the Add button (bottom left), and select the ZED odom topic under the By topic tab. To display odometry data correctly, make sure to select the newly created Odometry object in the left list, set Position Tolerance and Angle Tolerance to 0, and Keep to 1.

Odometry preview

Using multiple ZED with ROS

It is possible to use multiple ZED cameras with ROS. Simply launch zed_multi_cam using the roslaunch command:

roslaunch zed_wrapper zed_multi_cam.launch

Note: The ZED camera uses the full USB 3.0 bandwidth to output video. When using multiple ZED cameras, you may need to reduce the camera framerate and resolution to avoid corrupted frames (green or purple frames). You can also use multiple GPUs to load-balance computations and improve performance.
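
As a rough back-of-the-envelope check (assuming each camera streams side-by-side stereo frames in a 2-bytes-per-pixel YUV format over UVC; treat these numbers as an estimate, not a ZED specification):

```python
def stream_bandwidth_mb_s(width, height, fps, bytes_per_pixel=2):
    """Approximate uncompressed bandwidth (MB/s) of one side-by-side
    stereo stream. Assumes 2 bytes/pixel (YUV 4:2:2) -- an assumption
    for illustration, not a ZED specification."""
    # Side-by-side: both sensors share one frame, so width is doubled
    return 2 * width * height * bytes_per_pixel * fps / 1e6

# One camera at 720p / 60 fps (per-sensor width 1280)
print(round(stream_bandwidth_mb_s(1280, 720, 60), 1))  # 221.2 MB/s
```

Under these assumptions, two cameras at 720p60 would already approach the practical limit of a single USB 3.0 host controller, which is why dropping the framerate or resolution (or spreading cameras across controllers) avoids corrupted frames.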

For more information, read the ZED ROS wiki page.

  • Myzhar

    A quick question: are the images published with “zed_depth.launch” really “raw”, or are you publishing the rectified images?

    Thank you

    • Hey Myzhar, the images published are rectified.

      • Myzhar

        The question is old… it was related to the same question I asked on Github about the names of the topics.

  • Brad Bazemore

    When will there be a ROS driver or some way to get the ZED’s data into a point cloud/PCL?


    Hey, installation of the new SDK is stuck: the Internet connection window is not responsive. This was not a problem in the last SDK.

    • きりころ

      Same problem. I downloaded the ZED SDK for Jetson TK1 v0.9.1-alpha.

  • Hi,
    This problem has been solved in v0.9.2, available in the developer section.

    ** ZED Team **


    I installed v0.9.2; there is no file named zed_wrapper in the source folder.


    Hey, the nodes are not running. Check the error.

    • Hi,

      Just to let you know that the ZED ROS wrapper is now only on Github, and is no longer bundled with the installer.

      Regarding the “No GPU Compatible” error, can you activate the verbose mode (third boolean parameter) in the ZED Init() function?

      The first verbose line will tell you which GPU is used. It should be the K20M.

      Do you have the CUDA driver and toolkit installed properly (through the NVIDIA JetPack…)?

      ** ZED Team **


        The last version of the SDK (0.6) used to work perfectly with the Jetson; this SDK is creating problems. The ZED is not running with the Jetson. We are getting errors while executing the 3D visualizer.


        Hi, the error still exists. I have CUDA 6.5 installed on the Jetson, and it has been flashed properly, I guess. I have changed Init to performance mode. It gives the same error after the statement about trying to self-calibrate. It runs fine without the ROS wrapper on the Jetson TK1. Please check and reply ASAP.
        Can you give a git link?

  • Dr. Kyriakos Deliparaschos

    Any chance to drop the resolution below VGA, say to 320×240, in order to increase the FPS? Right now I manage 80 FPS in VGA mode with the depth sensing mode set to RAW and quality set to PERFORMANCE.


    Hi guys, I am facing a problem running the ROS driver on the Jetson kit: I am getting a runtime error, please help ASAP. It works fine on a Linux PC.

  • Richard Nunziata

    I should be able to get a ROS PointCloud2 by running the zed-ros-wrapper depth launch and a depth_image_proc/point_cloud_xyzrgb node. However, this seems not to work. I’ve opened an issue against zed-ros-wrapper: even though both the color image and the depth image look OK in rviz, you cannot get a point cloud.

  • 우종민

    I met this problem:
    I installed CUDA 6.5 and 7.5, but zed_wrapper can’t find the shared object.
    (I installed CUDA via the .deb (network) file from the CUDA homepage.)

    How can I solve this problem? I didn’t edit any settings.

  • Mercedes


    We are using ROS + ZED on a Jetson TK1. If we run the “ZED with OpenCV” example, the image looks fine (first image), but if we use ROS the image is wrong, as you can see with image_view (second image). Can you help us?


    • Joshua

      Hi @disqus_ZupuY4Rc6r:disqus! I have the same problem as you did (just not using a Jetson). Did you find out what the problem was? Would love an explanation.

  • Joshua

    Hi guys!
    I have a simple question, it is just not clear in my head yet: is the self-calibration (i.e. “(Init) Starting Self-Calibration in background…”) a calibration of the intrinsic or the extrinsic parameters of the camera?
    I understand that the ZED calibration tool is for setting the intrinsic parameters, but I’m not sure about self-calibration…

  • Joshua

    I need help please: I don’t understand the utility of the ZED calibration tool, as the parameters of the camera are always modified by the self-calibration process when launching zed.launch.
    Is there any way I could bypass this self-calibration and try my own settings?

  • Joshua

    Hi guys,

    I am working on a project using a ZED stereo camera with viso2_ros to be able to visualize and use the odometry. I use viso2_ros for the odometry, and run it with the zed_wrapper_node. When I visualize the odometry message (with rviz), the rotation of the camera is detected (though a bit slowly), but no translation is detected/shown…

    I thought the problem might come from the fact that two odometries are interfering (the ZED one and the viso2 one), but the problem remains when I disable the ZED odometry (in the zed_wrapper_node)! Plus, I tried using both the new and the old version of the node!

    Has anyone had an odometry translation problem before?

    Maybe I’m not disabling the zed odometry correctly?

  • cn

    Hi Everyone!

    There was no specification of the accuracy of the device, so I decided to buy one to try it out.
    The point cloud that I got from the camera looks very off. Can anyone share a real sample .pcd/.ply file so I can check whether it is a problem with my camera? Do you have to calibrate it yourself, or does the factory calibration work?

  • Dr. Kyriakos Deliparaschos

    Are you planning to support any time soon AMD or Intel/OpenCL GPUs instead of NVidia/CUDA only?

  • Filippo Muzzini

    Hi! I tried to compile zed-ros-wrapper on the TK1 but I had a linking error: “/usr/local/zed/lib/ undefined reference to XInitThreads”. Can you help me?

  • Andrew Smith

    I am working with the DJI SDK for the Matrice and it requires the use of ROS Indigo and Ubuntu 14.04. Which version of your SDK and ROS wrapper should I use?

  • Josh

    Hi Guys,

    I am trying to build a stereo tracker for a ball in 3D space. (Ideally, I should be able to track a moving ball at relatively high speed along with the corresponding x, y, z coordinates.) How feasible is this with the ZED camera, and would it be good enough to keep up with fast-moving objects?