Building Multiplayer AR Experiences with ZED Mini
The ZED Mini turns VR headsets into AR headsets by adding two cameras on the front. Now, developers can add multiplayer interactions to their AR apps, taking the experience to a whole new level.
Multiplayer AR, or “social AR,” is the next major step for augmented reality. While companies such as Google have recently added similar functionality to smartphones with Cloud Anchors, the experience on a mobile phone is still limited in terms of immersion and interaction.
Making Multiplayer AR a Reality
Stereolabs’ ZED Mini is breaking augmented reality free from these limitations by enabling collaborative AR experiences on VR headsets such as the HTC Vive. By adding a passthrough stereo camera on top of a VR headset, developers can build realistic, shared AR experiences.
Building the First Multiplayer Sports AR Experience
Remember how popular the Wii Sports games were? At the time, they introduced a whole new way to play games. When we started working on multiplayer AR at Stereolabs, we wanted to create an experience that combines the fun of shared sports games with the immersive nature of VR.
This is how we ended up creating Ping Pong AR, a demo made with ZED Mini and Unreal Engine that allows users to play table tennis with a friend over a realistic-looking table… that doesn’t actually exist.
Building multiplayer AR on VR platforms required our team to solve several technical challenges.
To synchronize the positions of the two players, we used a pair of SteamVR Lighthouses and shared calibration data over the local network. With the Lighthouses serving as an external reference point, we could track each player's position at runtime. The add-on cameras then synchronize their understanding of the physical space and share the positions of virtual objects in real time, with all movements, interactions, and effects seamlessly replicated between the devices.
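The shared-reference-frame idea can be sketched as follows. This is a minimal 2D illustration, not Stereolabs' actual networking code; the function names and the planar (yaw + translation) calibration model are assumptions:

```python
import math

def to_shared_frame(point, device_yaw, device_origin):
    """Convert a point from a device's local tracking frame into the
    shared Lighthouse frame, given the device's calibrated yaw and origin."""
    x, y = point
    c, s = math.cos(device_yaw), math.sin(device_yaw)
    return (device_origin[0] + c * x - s * y,
            device_origin[1] + s * x + c * y)

def from_shared_frame(point, device_yaw, device_origin):
    """Inverse: convert a shared-frame point into a device's local frame."""
    dx = point[0] - device_origin[0]
    dy = point[1] - device_origin[1]
    c, s = math.cos(device_yaw), math.sin(device_yaw)
    return (c * dx + s * dy, -s * dx + c * dy)

# Player A sees the ball at (1.0, 0.5) in its own frame; it publishes the
# shared-frame position over the network, and Player B converts that into
# its own local frame before rendering.
ball_shared = to_shared_frame((1.0, 0.5), math.pi / 2, (2.0, 0.0))
ball_for_b = from_shared_frame(ball_shared, math.pi, (4.0, 0.0))
```

Because every device knows its own pose relative to the same external Lighthouse frame, any object published in that frame lands at the same physical spot for both players.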
Playing table tennis without depth occlusion would not work: the players' hands and arms would be hidden by the virtual table. Occlusion culling at the rendering frame rate (90 FPS) was required to disable rendering of virtual objects wherever they are obscured by the players' arms. We used the ZED Mini's real-time depth maps to detect dynamic occlusions, remove the occluded parts of the table, and realistically merge the players' arms with the virtual table.
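The per-pixel depth test behind this is simple to sketch. Assuming a measured camera depth and a rendered virtual depth per pixel (names and the list-based "image" are illustrative, not the actual GPU implementation):

```python
def composite(camera_depth, virtual_depth, camera_px, virtual_px):
    """Per-pixel occlusion test: keep the virtual pixel only where the
    virtual surface is closer than the real-world depth measured by the
    stereo camera; otherwise the real image (e.g. a player's arm) wins."""
    out = []
    for cd, vd, cp, vp in zip(camera_depth, virtual_depth, camera_px, virtual_px):
        if vd is not None and vd < cd:
            out.append(vp)  # virtual table is in front of the real surface
        else:
            out.append(cp)  # real arm occludes the virtual object
    return out

# Depths in meters; None means no virtual geometry at that pixel.
camera_depth = [0.6, 0.6, 2.0, 2.0]   # arm close to camera, then background
virtual_depth = [1.0, None, 1.0, None]  # virtual table at 1 m
frame = composite(camera_depth, virtual_depth, ["arm"] * 4, ["table"] * 4)
# → ['arm', 'arm', 'table', 'arm']
```

In the real pipeline this comparison runs on the GPU against the depth buffer at 90 FPS; the sketch only shows the decision rule.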
In ping pong, the ball travels between players in a fraction of a second. Each player has to react quickly, and latency becomes a huge issue. In passthrough AR, the main sources of latency are stereo capture, depth estimation, and video augmentation.
For the project, we developed a video pipeline that reduces motion-to-photon latency to below 60 ms. We modified the UE4 source code to integrate deeply into the engine and eliminate additional sources of capture and render latency. The remaining rotational latency is corrected by our Video Async Reprojection technology, which is conceptually similar to Oculus Timewarp but adapted for stereo video cameras. With this technique, we finally had a game that was comfortable and exciting to play!
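The core idea of rotational reprojection can be sketched in one dimension: shift the already-captured image by the head rotation that happened while the frame was in flight. This is a hedged simplification of the timewarp concept, not the actual Video Async Reprojection code; the shift direction and edge-clamping behavior are assumptions:

```python
def reproject(scanline, yaw_at_capture, yaw_at_display, px_per_radian):
    """Approximate rotational reprojection on a 1D scanline: shift the
    image by the yaw change between capture and display time, so the
    video appears locked to the world despite pipeline latency.
    Pixels shifted in from the edge are clamped to the border value."""
    shift = round((yaw_at_display - yaw_at_capture) * px_per_radian)
    if shift == 0:
        return list(scanline)
    if shift > 0:
        return scanline[shift:] + [scanline[-1]] * shift
    return [scanline[0]] * (-shift) + scanline[:shift]
```

A full implementation would warp both stereo views with the headset's latest rotation right before scan-out; only the rotational component is corrected this way, which is why the rest of the pipeline still has to be fast.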
To show this multiplayer AR experience to the world, we wanted to capture an external view of the playing area. We simply combined a ZED camera with a Vive Tracker and allowed a third user, besides the two players, to connect to the game as a spectator and film the scene with the ZED.
After a quick calibration step (using our semi-automated calibration tool for green screen capture), we managed to get this live external view of the scene:
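The pose arithmetic behind the spectator rig amounts to composing the tracked Vive Tracker pose with the fixed tracker-to-camera offset found during calibration. A minimal planar sketch, with assumed names and an assumed (yaw, x, y) pose representation:

```python
import math

def compose(pose_a, pose_b):
    """Compose two planar rigid transforms given as (yaw, x, y):
    the result applies pose_b expressed in the frame of pose_a."""
    ya, xa, yya = pose_a
    yb, xb, yyb = pose_b
    c, s = math.cos(ya), math.sin(ya)
    return (ya + yb, xa + c * xb - s * yyb, yya + s * xb + c * yyb)

# Spectator camera pose = live tracker pose composed with the fixed
# tracker-to-ZED offset measured once during calibration (values assumed).
tracker_pose = (math.pi / 2, 1.0, 0.0)  # from SteamVR tracking, per frame
zed_offset = (0.0, 0.1, 0.0)            # calibration result, constant
camera_pose = compose(tracker_pose, zed_offset)
```

Once the offset is calibrated, the virtual spectator camera follows the physical ZED wherever the tracker moves, so the rendered scene stays aligned with the live video for compositing.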
Get Started with Collaborative AR!
Building this multiplayer AR demo has required lots of time and effort from a great team of researchers and engineers. We’re very excited to share this playable demo and hope you will enjoy the experience!
We have also released the UE4 source code of the demo on GitHub, so you can start building your own multi-user AR experience.