How to Make Mixed Reality VR Videos in Unity
Learn how to capture VR content with an external point of view, and mix real-world people and objects with the virtual world.
Part 1: Introduction
From sleek game trailers to Let’s Plays on YouTube, mixed reality is by far the coolest way to show off VR content. By carefully aligning a virtual third-person camera with a real camera and applying a few compositing tricks, creators have let millions of people grasp the full excitement of VR experiences without ever putting on a headset.
It’s exciting enough when you’re using regular cameras. But with a ZED depth camera, you can go even further, adding lighting and shadows from the virtual world onto real objects and making occlusions much more exact. Best of all, the ZED makes setting up mixed reality even easier, as our plugin handles much of the manual work for you.
This tutorial will walk you through how to set up a mixed reality scene in Unity from start to finish, without writing a line of code. That includes importing our Unity plugin, configuring your scene, affixing your ZED to tracked objects or reference points in the real world, calibrating it, configuring your greenscreen, and getting an internet-ready final output for recording or streaming.
Table of Contents
- 1 – Introduction
- 2 – Installing the ZED Plugin
- 3 – Positioning the Camera
- 4 – Calibration
- 5 – Controller Delay
- 6 – Background Subtraction
- 7 – Virtual Lighting
- 8 – Final Output
What you’ll need
Required:
- ZED depth camera (order here) or ZED mini (order here)
- VR-capable, ZED-compatible PC (requires NVIDIA GPU and a USB 3.0 port)
- HTC Vive or Oculus Rift
- ZED SDK and Unity plugin (downloads here)
- Unity 5.6 or greater (download here)
- OBS, an open-source video capture program (download here)
- Access to the game/app’s source code/Unity project
Optional:
- Vive Tracker (purchase from HTC or Amazon) or extra Vive controller
- USB extension cables (see our preferred choices here)
How mixed reality works
Fundamentally, mixed reality works by taking a real-world camera and a virtual, “in-game” camera, and completely synchronizing them in terms of position, rotation, field of view, etc. Then, using whatever data you have on both worlds, you combine the two so that the real and virtual worlds appear as one. To do this, you need to know which objects (real or virtual) are closer to the camera so you know which one to put in front.
Standard mixed reality setups don’t have much data on where things are in the real world, but they do have the exact position of the user’s head (from the headset). So they render three layers in this order: virtual objects behind the user’s head (the background), the real world, and virtual objects in front of the user’s head. It’s a simple but clever heuristic that works well for many situations.
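To make that heuristic concrete, here is a minimal, hypothetical sketch of how a standard (non-ZED) setup might split the virtual scene at the headset’s distance using two cameras attached to the external view. Nothing below comes from the ZED plugin, and the names (ExternalCameraRig, backgroundCam, foregroundCam, hmd) are illustrative only.

```csharp
using UnityEngine;

// Illustrative sketch of the classic layered approach: one external viewpoint
// rendered by two virtual cameras, split at the user's head distance.
public class ExternalCameraRig : MonoBehaviour
{
    public Camera backgroundCam;   // draws virtual objects farther away than the player
    public Camera foregroundCam;   // draws virtual objects closer than the player
    public Transform hmd;          // tracked headset transform

    void LateUpdate()
    {
        // How far the user's head is from the external camera this frame.
        float headDistance = Mathf.Max(0.01f, Vector3.Distance(transform.position, hmd.position));

        // Background layer: only geometry beyond the head...
        backgroundCam.nearClipPlane = headDistance;

        // ...foreground layer: only geometry in front of the head.
        // The final video is stacked background -> real footage -> foreground.
        foregroundCam.farClipPlane = headDistance;
    }
}
```

Notice that the split depends only on the headset’s position, which is exactly why the limitations described next appear.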

However, this presents certain limitations. First, you can’t depict a virtual object as sitting between the subject’s head and a body part in front of it, such as their hand, because the entire real layer is composited at a single depth. Second, since the “real” layer is treated as a flat plane, lighting or shadows cast from the virtual world onto it wouldn’t look realistic, and as such aren’t typically supported.
The ZED, on the other hand, knows the distance to every pixel in the real world. This allows Unity to treat real people and objects like any other virtual object. That enables “per-pixel occlusion,” where each real pixel appears in front of or behind the corresponding virtual one based on its own depth rather than the headset’s. It also means lights and shadows can be applied using the real object’s actual geometry.
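By contrast, the per-pixel decision looks like the sketch below. This is purely illustrative; in practice the comparison runs on the GPU inside the plugin’s rendering, and the depth maps (in meters from the camera) and color buffers shown here are hypothetical, pixel-aligned inputs.

```csharp
using UnityEngine;

// Illustrative CPU version of the per-pixel occlusion decision.
// realDepth/virtualDepth are hypothetical depth maps; realColor/virtualColor
// are the matching color buffers for the same camera image.
static class PerPixelOcclusion
{
    public static Color[] Composite(float[] realDepth, Color[] realColor,
                                    float[] virtualDepth, Color[] virtualColor)
    {
        var output = new Color[realColor.Length];
        for (int i = 0; i < output.Length; i++)
        {
            // Whichever surface is closer to the camera at this pixel wins,
            // instead of splitting the whole image at the headset's distance.
            output[i] = realDepth[i] <= virtualDepth[i] ? realColor[i] : virtualColor[i];
        }
        return output;
    }
}
```

You never write this yourself when using the plugin; it is handled during rendering, and it is also what makes the virtual lighting and shadow effects covered later in this guide possible.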

Notes before we start
- This is a detailed, step-by-step tutorial. If you’re comfortable implementing mixed reality on your own and just need a quick reference, our documentation is far less verbose.
- “Mixed reality” means different things to different people. In this tutorial, we’re talking about filming someone in front of a greenscreen to show them inside a virtual world.
- We refer to the ZED camera throughout, but our newer camera, the ZED mini, supports all the features listed here and can be used equally well.
- This guide requires adding our plugin to a game’s Unity project, so it doesn’t apply if you’ve bought a VR game on Steam and want to make a mixed reality video of it. However, you can still use the ZED as a regular camera with the standard ways of implementing mixed reality, which will also let you use our calibration program.
- Oculus has added native ZED integration to the mixed reality part of their SDK. If you are only building for the Oculus Rift, you may consider using their implementation instead. They offer a guide here. This implementation does not include per-pixel occlusion, but does offer similar lighting effects and adds a useful “virtual greenscreen” that our own plugin doesn’t currently implement. It is also currently the only way to use a third controller to track your camera when using a Rift.
- This guide was written based on v2.3 of the ZED SDK and Unity plugins. Small details may change in the future.