Introducing ZED Mini
We’re excited to introduce the ZED Mini, our newest depth camera. Featuring visual-inertial motion tracking, improved depth sensing and an all-new camera design, the ZED Mini is also optimized for augmented reality.
At Stereolabs, we believe that stereo vision is the most effective way to bring spatial understanding to the objects around us. Creating simple and widely available software that abstracts away the complexities of 3D computer vision has been our goal since day one. Today marks an important milestone on that journey.
We’re announcing the launch of the ZED Mini, our visual-inertial depth camera. With a human-like eye separation of 63 mm, improved depth sensing technology and new motion sensors, it gives developers, artists and designers a powerful new tool to create amazing experiences. The camera is now available to order through our store.
High-speed Dual Camera with Improved Depth and Extended Range
Built on the same sensor technology as the ZED, the ZED Mini features dual high-speed 2K image sensors and a 110 degree field of view. With an eye separation of 63 mm, the camera senses depth from 0.1 meters to 12 meters with improved accuracy and fewer occlusions in the near range.
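To give a sense of the geometry at work, stereo depth follows the classic relation Z = f·B/d, where f is the focal length in pixels, B is the baseline (63 mm here) and d is the disparity in pixels. The sketch below is a back-of-the-envelope illustration, not the SDK's actual algorithm, and the focal length value is an assumption chosen for round numbers:

```python
# Sketch: how a stereo camera converts pixel disparity into metric depth.
# Z = f * B / d  (f: focal length in pixels, B: baseline in meters, d: disparity in pixels)
# The focal length here is an assumed value for illustration, not the ZED Mini's spec.

def depth_from_disparity(disparity_px: float,
                         focal_px: float = 700.0,
                         baseline_m: float = 0.063) -> float:
    """Return depth in meters for a given disparity in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Larger disparity -> closer object; small disparities map to far depths.
near = depth_from_disparity(441.0)   # ~0.1 m
far = depth_from_disparity(3.675)    # ~12 m
```

The wide baseline is what pushes the far end of the range out: for a fixed minimum measurable disparity, a larger f·B product means a larger maximum depth.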
We’re also introducing a new Ultra-Depth mode for the ZED Mini. True HD depth maps can now be extracted at the native resolution of the sensor, offering up to a 2x gain in depth accuracy and range. In Ultra-Depth mode, the range of the camera is increased to 20 meters. ZED users will also benefit from the new Ultra mode by upgrading to the latest SDK update, increasing the depth range of their camera up to 40 meters.
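The range gain from computing depth at native resolution follows directly from the same geometry: maximum range is roughly f·B/d_min, and doubling the image width doubles the focal length in pixels, so for the same minimum usable disparity the range roughly doubles. The resolutions and minimum disparity below are illustrative assumptions, not official specifications:

```python
import math

# Sketch: why computing depth at the sensor's native resolution extends range.
# max range ~= f_px * B / d_min; f_px scales linearly with image width.
# Widths and d_min below are illustrative assumptions, not ZED specs.

def max_range_m(width_px: int, min_disparity_px: float = 2.0,
                hfov_deg: float = 110.0, baseline_m: float = 0.063) -> float:
    """Approximate maximum stereo range for a given image width."""
    focal_px = (width_px / 2) / math.tan(math.radians(hfov_deg / 2))
    return focal_px * baseline_m / min_disparity_px

half_res = max_range_m(1104)   # depth computed at half resolution, ~12 m
full_res = max_range_m(2208)   # native-resolution depth: roughly 2x the range
```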
New Motion Tracking Technology
To ensure accurate motion tracking, ZED Mini integrates new gyroscopes and accelerometers. Using next-generation visual-inertial odometry technology, inertial measurements are fused at 500Hz with visual data from the stereo camera. The new technology allows applications such as autonomous navigation and augmented reality to track movements in space with higher accuracy and less drift.
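The intuition behind visual-inertial fusion can be shown with a toy 1-D complementary filter (a deliberate simplification, not Stereolabs' actual odometry): the gyro is integrated at 500 Hz for low-latency prediction, and each lower-rate visual pose estimate pulls the drifting inertial estimate back toward the truth. The rates, gyro bias and blend gain below are made-up values for illustration:

```python
# Toy 1-D complementary filter illustrating visual-inertial fusion
# (an illustrative simplification, not Stereolabs' actual algorithm).
# Gyro integrated at 500 Hz; a visual angle estimate arrives every 8th step.

IMU_RATE_HZ = 500
DT = 1.0 / IMU_RATE_HZ

def fuse_step(angle_est, gyro_rate, visual_angle=None, k=0.05):
    """One fusion step: integrate the gyro; blend in a visual fix if available."""
    angle_est += gyro_rate * DT                 # high-rate inertial prediction
    if visual_angle is not None:                # low-rate visual correction
        angle_est += k * (visual_angle - angle_est)
    return angle_est

# A biased gyro (0.5 rad/s true rate + 0.1 rad/s bias) would drift on its own;
# periodic visual fixes keep the fused estimate near the true angle.
angle = 0.0
for i in range(500):                            # one second of data
    visual = 0.5 * (i * DT) if i % 8 == 0 else None
    angle = fuse_step(angle, 0.6, visual_angle=visual)
```

Pure gyro integration would end at 0.6 rad after one second (0.1 rad of drift); with the visual corrections the estimate stays within a few hundredths of the true 0.5 rad.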
All-New Camera Design with USB Type-C
ZED Mini introduces a beautiful black design with a heat and shock resistant aluminium bezel that maintains sensing quality over time and preserves factory calibration. A new USB Type-C port integrates video output and power in a single connector, delivering up to 3Gb/s of data at the lowest latency. To support high data rates and increase device reliability, we have redesigned the electronics board, improved materials and included additional protection circuits. The external connector allows flexibility in cable selection while improving integration and usability in the field.
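A quick back-of-the-envelope check shows why that link budget matters. The resolution and pixel format below are assumptions for illustration, not official streaming specs; the point is simply that an uncompressed stereo stream at VR frame rates consumes a large fraction of a ~3 Gb/s link:

```python
# Back-of-the-envelope bandwidth check (assumed figures, not official specs):
# raw bit rate of an uncompressed side-by-side stereo stream.

def stream_gbps(width, height, fps, bits_per_px, views=2):
    """Raw video bandwidth in gigabits per second."""
    return views * width * height * fps * bits_per_px / 1e9

# e.g. stereo 720p at 60 fps in a 16-bit/pixel YUV format
rate = stream_gbps(1280, 720, 60, 16)   # ~1.77 Gb/s, within a ~3 Gb/s budget
```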
Joining the family
ZED Mini works seamlessly with the existing SDK, plugins and samples available for the ZED. From ROS and OpenCV to Unity and Unreal, you can use a wide range of third party libraries with the ZED Mini to build exciting applications ranging from robotics and people analytics to mixed reality.
Turning your room into a spatial display
Augmented reality has tremendous potential, but it’s not without its challenges. Current solutions like HoloLens suffer from a limited field of view and display transparent holograms that break the sense of presence and realism. They also require you to map out the entire area beforehand. Stereolabs is breaking augmented reality free from these limitations and bringing true augmented reality to the industry.
The ZED Mini sports the same high-quality dual 2K image sensors as the ZED, now separated by 63 mm to reproduce the distance between human eyes. With its high-speed HD video at 60 fps and 110-degree field of view, the ZED Mini turns VR headsets into high-end stereo video pass-through HMDs.
Besides stereo passthrough, the camera understands the world around you and perceives moving objects in space, enabling a level of interaction and realism never seen before with augmented reality devices.
Optimized for augmented reality
Unlike AR on traditional devices, AR in a VR HMD is incredible but unforgiving. We have spent the last three years working on challenges such as latency mitigation, high frame-rate depth and motion sensing, real-time occlusion filling, 3D engine integration and low-level performance optimization. The result is the world’s first stereo video pipeline optimized for VR rendering.
Many challenges had to be overcome. One of them was running consistently at a comfortable frame rate inside an application that is already heavy on the GPU for rendering, post-processing and display. Positional tracking, depth computation and filtering for the left and right eyes had to be streamlined, re-architected and load-balanced so that all the pieces fit perfectly together on a single VR PC.
To overcome the latency added by the USB video transmission, we developed a new technology called Video Async Reprojection (VAR). Working in a similar way as Oculus Timewarp, VAR uses the IMU of the camera to reposition video frames in the headset with the correct timing. This reduces the perceived rotational latency and dramatically improves the visual experience.
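The core idea behind timewarp-style reprojection can be sketched in a few lines (a simplified illustration, not Stereolabs' actual VAR implementation): using the rotation the IMU has measured since the frame was captured, the stale frame is shifted so it lines up with where the user is looking now. The resolution and field-of-view values below are assumptions for illustration:

```python
import math

# Simplified sketch of timewarp-style reprojection (not the actual VAR
# implementation): use the IMU rotation accumulated since the frame was
# captured to shift the image, hiding rotational latency from transmission.

def reproject_shift_px(yaw_capture_rad, yaw_display_rad,
                       width_px=1280, hfov_deg=110.0):
    """Horizontal pixel shift that re-aligns a stale frame with the current pose."""
    focal_px = (width_px / 2) / math.tan(math.radians(hfov_deg / 2))
    # Pinhole model: a pure yaw rotation maps to a shift of f * tan(delta_yaw)
    return focal_px * math.tan(yaw_display_rad - yaw_capture_rad)

# Head turned 2 degrees while the frame was in flight -> shift of ~16 px
shift = reproject_shift_px(0.0, math.radians(2.0))
```

This only compensates rotation, which is exactly why it helps: rotational latency is what the visual system notices most, and rotation can be corrected without knowing scene depth.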
We spent considerable effort building a tight integration between our video+depth pipeline and 3D engines such as Unity and Unreal. Custom low-level modifications were necessary to ensure developers can create stunning mixed reality applications with real-time depth interaction and space understanding. As a result, we released plugins for both Unity and Unreal Engine, available on our Downloads page.
We hope that you will enjoy this completely new visual experience, and take the opportunity to create amazing new applications that blend the real and virtual worlds and alter our sense of reality.