ZED 1.0 is here, adds Positional Tracking and 3D Reconstruction

We are excited to announce the release of ZED SDK 1.0! This update brings 6-DoF positional tracking to VR and robotics, real-time 3D mapping with ZEDfu, a new Wide VGA resolution and more.

The Road to 1.0

It’s been an exciting year here at Stereolabs. Since the launch of the camera in 2015, the ZED community has been growing far beyond our expectations. The camera has been used in a variety of autonomous cars, drones, and robots, in the visual effects and movie industry, and even in retail and sports applications. We received great feedback from thousands of developers all around the world and worked hard to make continuous improvements to the ZED.

In the meantime, our research team here at Stereolabs has been working relentlessly to prepare this groundbreaking update. 1.0 is the result of months of work from many computer vision engineers and scientists. It pushes the boundaries of what’s possible with stereo vision, and makes the ZED the first device in the industry to offer real-time depth sensing, positional tracking and 3D mapping capabilities.

We are excited to see what new applications you will create with the ZED. So read on to learn what’s new in v1.0, and download the latest SDK here.

Positional Tracking for VR and Robotics

ZED 1.0 introduces position and orientation tracking for any device that has a ZED mounted on it. This opens the door to creating incredible mobile and desktop VR experiences with total freedom of movement. In other fields such as robotics, this feature can help machines keep track of their own location as they move around a space, without GPS or a high-end IMU.

To perform tracking, the ZED uses a novel depth-based SLAM (Simultaneous Localization and Mapping) technology that was developed from scratch and optimized to run at high speed. The camera maps the three-dimensional world in front of it in real time and understands how the user moves through space. On desktop and laptop GPUs, tracking runs at the camera’s frame rate, which means you can get up to 100 Hz tracking frequency in WVGA mode.
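
The 6-DoF pose the tracker reports is, conceptually, a rigid-body transform: a rotation plus a translation that together locate the camera in world space. Here is a minimal pure-Python sketch of how per-frame incremental motions compose into a global pose; the motion values are made up for illustration, and this is not SDK output:

```python
# Sketch: accumulating a 6-DoF camera pose from per-frame incremental
# motion, represented as 4x4 homogeneous transforms (pure Python).
import math

def mat_mul(a, b):
    """Multiply two 4x4 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(tx, ty, tz):
    return [[1, 0, 0, tx], [0, 1, 0, ty], [0, 0, 1, tz], [0, 0, 0, 1]]

def rotation_y(deg):
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return [[c, 0, s, 0], [0, 1, 0, 0], [-s, 0, c, 0], [0, 0, 0, 1]]

# Start at the origin; apply one incremental motion per grabbed frame.
pose = translation(0, 0, 0)
for step in [translation(0, 0, 0.1), rotation_y(90), translation(0, 0, 0.1)]:
    pose = mat_mul(pose, step)

# Move 0.1 m forward, turn 90 degrees, move 0.1 m forward again:
# the camera ends up at roughly (0.1, 0, 0.1) in world coordinates.
x, y, z = pose[0][3], pose[1][3], pose[2][3]
print(f"({x:.3f}, {y:.3f}, {z:.3f})")  # (0.100, 0.000, 0.100)
```

The same composition is what lets the SDK report the camera’s position relative to where tracking started, rather than only frame-to-frame motion.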

The positional tracking API is available in ZED SDK 1.0 for Windows, Linux, and Jetson, and also supports Unity, ROS and other third-party libraries. Check out our samples on GitHub and get started.

ZEDfu for Real-time 3D Mapping

Photogrammetry is great, but it is time-consuming and requires a lot of manual steps. To simplify the process of capturing and editing 3D models, we have created ZEDfu, an easy-to-use 3D scanning application that captures 3D models of a scene in real time.

Using ZEDfu, users can pick up and move a ZED around to generate a 3D mesh of any indoor or outdoor environment. The resulting mesh can be imported into 3D software like Blender, Maya or MeshLab for further editing and finishing. And for those of you wondering: yes, ZEDfu means ZED fusion!

ZEDfu is available as a standalone application on Windows. To get it, download the ZED SDK here.
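
Meshes bound for Blender, Maya or MeshLab commonly travel as PLY files, and parsing the ASCII header is a quick way to sanity-check a scan before opening it. Here is a minimal stdlib-only sketch; the sample mesh is fabricated, and PLY as ZEDfu’s output format is an assumption rather than something documented above:

```python
# Sketch: reading element counts out of an ASCII PLY header.
# The sample mesh below is fabricated; that ZEDfu emits PLY is an
# assumption here, not a documented fact.
import io

SAMPLE_PLY = """\
ply
format ascii 1.0
element vertex 3
property float x
property float y
property float z
element face 1
property list uchar int vertex_indices
end_header
0.0 0.0 0.0
1.0 0.0 0.0
0.0 1.0 0.0
3 0 1 2
"""

def ply_counts(stream):
    """Return {element_name: count} parsed from a PLY header."""
    counts = {}
    assert stream.readline().strip() == "ply", "not a PLY file"
    for line in stream:
        parts = line.split()
        if parts[:1] == ["element"]:
            counts[parts[1]] = int(parts[2])
        elif parts[:1] == ["end_header"]:
            break
    return counts

counts = ply_counts(io.StringIO(SAMPLE_PLY))
print(counts)  # {'vertex': 3, 'face': 1}
```

Vertex and face counts are a useful first check on scan quality: a large room captured at too coarse a resolution will show up immediately as a suspiciously small vertex count.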

High-speed WVGA mode

High-speed VGA is a popular choice among ZED users. We have updated the ZED firmware (1142) and introduced a new Wide VGA mode at 100 FPS that replaces the previous VGA mode. WVGA resolution is increased to 672×376 per image and offers improved image quality along with a much wider field of view. Launch the ZED Explorer to update your firmware.

Performance improvements and SDK changes

The 1.0 full update list includes the following functionalities and improvements:

  • Positional tracking API: New tracking API lets you localize the camera in space.
  • ZEDfu application for capturing real-time 3D models.
  • Wide VGA mode: The new high-speed mode brings improved image quality and a wider field of view at 100FPS.
  • SVO compression: New lossless SVO compression saves 66% in file size and processing time.
  • New depth sensing mode with up to 2x faster computation: PERFORMANCE mode now uses a new faster-than-real-time stereo matching algorithm, with up to a 2x speedup on Windows, Linux and Jetson embedded boards.
  • Manual control for camera exposure time: Exposure time can now be configured manually.
  • Unity support.
  • OpenCV 3.1 support.
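
The SVO codec itself isn’t documented here, but the guarantee that “lossless” makes, namely that every byte survives a compress/decompress round trip, is easy to demonstrate with any general-purpose compressor. The sketch below uses zlib purely as a stand-in; the SDK’s actual algorithm and its 66% figure are not reproduced:

```python
# Sketch: what "lossless" means for recorded frames, using zlib as a
# stand-in for the SDK's (unspecified) SVO codec.
import zlib

# A fabricated 8-bit grayscale frame with smooth gradients, which
# compresses well -- real sensor noise would compress less.
width, height = 672, 376
frame = bytes((x + y) % 256 for y in range(height) for x in range(width))

compressed = zlib.compress(frame, 6)
restored = zlib.decompress(compressed)

# Lossless: every byte survives the round trip, and the file shrinks.
assert restored == frame
assert len(compressed) < len(frame)
ratio = 1 - len(compressed) / len(frame)
print(f"saved {ratio:.0%} on this synthetic frame")
```

A lossless recording format matters for SVO files in particular, since recorded sequences are often replayed through the depth pipeline later; any compression artifacts would change the computed depth maps.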

We’re very excited to share this release with you, and we’re eager to see what applications you will create. Download ZED SDK 1.0 to get started, and don’t forget to send us feedback and updates about your work! And follow us on Twitter for news and updates.

 – The ZED team

  • Hans Christian Digermul

    This looks amazing! Would it be possible to 3D-map a cave with this equipment? I’m worried about the light sensitivity. What do you think?

    • Bryan Lyon

      I think you’d be better off with LADAR or ultrasonic sensors. 3D mapping with cameras really relies on good lighting.

    • As Bryan said, the ZED will not be able to map areas without light. So it really depends on the power of your LED headlight.

  • dd

    I’m sorry for asking technical questions here, I don’t see any other appropriate place out there for discussions about problems with ZED SDK and did not receive answers to my questions sent to the support@stereolabs.com. Please take a look into my issue here http://stackoverflow.com/questions/38696684/zed-sdk-crashes-when-used-with-pclintegralimagenormalestimation.

  • Revilo Keb

    Hi, are there plans to make ZEDfu available also on Linux or release the source code such that it could be easily ported to Linux?

    • We plan to add ZEDfu as part of the ZED SDK on Windows, Linux and Tegra, so you will be able to use it in your own applications.

      • Rayjan Wilson

        Any updates on adding ZEDfu to the SDK? It’s still not present in v1.2

  • André Silva

    Hi, sorry if this is a dumb question, but I have installed the ZED SDK 1.0.0c on Windows 10 and it was supposed to have the ZEDfu application, right??

  • Curtis Edwards

    Hi ZED,
    First great product, I would like to push the boundary a little more and have a few questions:

    1: can two or more cameras be used to increase position/rotation accuracy (±0.5mm, ±0.5deg)?
    (offset, looking in the same direction or different directions?)

    2: can additional hardware be integrated (ie: GPS/inertial, rotation sensors; goal of question 1)?

    3: is it still possible to get 50fps HD with two ZEDs (goal of question 1)? what portable hardware could do this (Jetson??) what’s the limit on cameras (USB3/processing/CPU/memory)?

    4: is there an option to “snapshot” where the camera location/rotation is at a particular time (ie: running time code while recording)? used in conjunction with hi-res cameras, so I would need to know when the photo was taken, or a pulse saved on the data stream?

    thanks for any info or perhaps where to start this process myself.

    • Curtis Edwards

      I’m replying with what I have found out, so that it may help others:

      *yes, you can use 2 cameras; it will increase stability but not accuracy.
      *better if both ZEDs face the same direction.
      *unsure yet if additional hardware can be integrated, perhaps with the SDK.
      *yes, it is still possible to get 50fps HD with two ZEDs, with a 960 or 1070 GTX and dual fast USB 3 (laptop, NOT the Jetson, partly because of USB 3).
      *the system timestamps with PC time to the nanosecond, but it would take some work to integrate time-code.


      • Nathan Brower

        Thanks for your comment Curtis. Have you personally used the positional tracking?

  • Xiao Li


    I was wondering if it is possible to use this for 3D reconstruction of a large area, for example an entire building.


    • You can use ZEDfu to map large areas; however, due to memory constraints, we recommend reconstructing separate small areas and merging them afterwards in Blender or other mesh processing tools.

    • Matteo Carotta

      That would be great! Has anybody tried an implementation for building mapping?

  • Russ Cereola

    What is the accuracy of the camera? Can we use it to create models of installed kitchen cabinets for the purpose of creating / designing an accurate custom countertop – export for CAD / CAM.

    • ZEDfu can be used to map larger areas, but at the moment we don’t recommend using it for fine-detail 3D reconstruction at the object level.

  • Gawie Van Blerk

    Would the ZED camera and SDK be a good match for tracking moving objects? E.g. tracking a ball in flight. The camera would be stationary.

    • Yes, the ZED in high speed mode (60 or 100FPS) should be able to track a ball in flight.

  • J Castillo

    Is it possible to load an SVO file for localization in Unity? I’m using zedCamera.EnableTracking (position, true, “C:/file/to/.svo”) but it’s not working. Do I have to do something else? Does the file path have to be added differently since I’m inside of Unity?

  • Lucas van Rennes

    Hi, this seems a great way to scan outdoors. Would it be suitable for scanning outdoor grounds and doing measurements on trees? Would it be roughly reliable? What would be the accuracy? Thanks!

  • splintersilk

    Hi! Has anyone managed to get the calculated camera motion data out of ZEDfu as an abc or fbx file?

    • Deepak R

      I have seen a path.ini file in the folder where the 3D mesh.ply gets created. But this is in the Windows version I have. Do tell me if you can make any sense out of this file.

      • splintersilk

        Hey there, thanks for the reply Deepak. I don’t recall seeing the path.ini file however based on the name I would guess that this would be a reference to directory/file pathing, as opposed to the camera animation.

  • cyberpiper

    Hello! I suppose many of you have used ZEDfu to produce 3D models. When using a photogrammetry application, the UV mapping/texturing is strange, all in little bits and pieces. I was wondering what the UV maps look like for a model made with ZEDfu… could anybody upload some here please? Thanks in advance!