How to Make Mixed Reality VR Videos in Unity

Part 4: Calibration

Now comes the step of mixed reality where you make sure that the real and virtual cameras are precisely aligned. Traditionally, you calibrate your camera by adjusting the virtual camera until a virtual controller model overlaps the real one in the output. But subtle mismatches in zoom, FOV, and rotation can cause the controllers to overlap properly in one corner of the scene but not in another, making calibration one of the most loathed parts of mixed reality.

Thankfully, we’ve released a calibration tool that uses the ZED’s depth sensing to make calibrating your camera as simple as playing a VR game. It’s still in beta, but is available for download here.

Once downloaded, make sure your controllers/Trackers are on and tracking, start the app and hit “Play” on the first screen. The next screen will ask how you will track the camera.

Calibration program menu

  • Choose None if you are not tracking the ZED, or if you are using the ZED’s own positional tracking.
  • Choose Vive Tracker if, well, you guessed it.
  • Choose Pad if you are using an Oculus Touch controller or a Vive Controller.

Hit Validate. If you chose Pad, it’ll ask you to pull the trigger of the tracked controller. Do so for the one attached to the ZED. Then you’ll be directed to put on your headset; stand in front of the ZED before you do.

Once in VR, take a look around. You should see the following:

  • A 2D screen floating in front of you showing video feed from the ZED. Look closely and you’ll see that there are virtual versions of your controllers on your screen as well. They move when you move the real ones, but are likely not lined up with their real counterparts. That’s what you’re here to fix.
  • A small UI floating above one of your controllers, with a list of positions and rotations similar to the Transform values at the top of Unity’s Inspector. Note that the controller this box floats above is colored blue – we’ll call it the blue controller from here on.
  • A 3D “point cloud” of what the ZED sees. It may be behind you. Notice that you’re there, as well!

Gif of point cloud inside app

The first thing you’ll notice about the point cloud is that it’s not lined up with the real world. This may be hard to tell for most of the scene, but look for your own body to get a sense of how far off it is. The scene may be rotated oddly, positioned off-center, or both. Your first job is to roughly line this world up before the calibration software fine-tunes it.

To do this, look at the UI floating above your blue controller. Notice the top box is highlighted. You can tap left or right on your Touch’s joystick or Vive Controller’s touchpad to adjust these settings. Tapping up or down will scroll through them.

Adjusting the rotation may be tricky as you can’t see the real world. But if it’s correct, the 2D screen in front of you should be facing you, perpendicular to the ground. You can use this as a reference for rotation.

Gif of adjusting orientation manually

For position, it’s best to use your own body. It’s easiest to start with the Y value – adjust it until your clone’s head is level with yours. Then do the Z value, so that your clone is as far from the camera as you are. Finally, slide the X value until your clone’s hand lines up with yours.

Gif of adjusting position manually

Once you’re satisfied, hold down the trigger on the blue controller until it goes to the next step.

Next, you will align the real and virtual controllers at several different points in space. Once you do, the tool will calculate an offset that will keep the controllers lined up wherever you go.
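The tutorial doesn’t specify how the tool computes this offset, but the standard way to align two sets of paired 3D points – here, where the tracked controller reports itself versus where it actually appears to the ZED – is a least-squares rigid fit such as the Kabsch algorithm. Below is a minimal, purely illustrative sketch in Python with NumPy; the function name and point format are our own assumptions, not part of the actual tool:

```python
import numpy as np

def rigid_transform(real_pts, virtual_pts):
    """Find rotation R and translation t that best map virtual_pts onto
    real_pts in the least-squares sense (Kabsch algorithm)."""
    real = np.asarray(real_pts, dtype=float)
    virt = np.asarray(virtual_pts, dtype=float)
    # Center both point sets on their centroids.
    rc, vc = real.mean(axis=0), virt.mean(axis=0)
    # 3x3 cross-covariance between the centered sets.
    H = (virt - vc).T @ (real - rc)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) in degenerate configurations.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = rc - R @ vc
    return R, t
```

With five or more well-spread reference points, a fit like this pins down both rotation and translation, which is why each sphere you complete makes the alignment hold everywhere in the tracking volume rather than at just one spot.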

To do this, look for a blue sphere nearby. Put the blue controller inside the sphere, which will make the sphere turn green. Then pull the trigger.

When you do this, you’ll notice the real-world video feed on the floating screen is paused. However, the virtual controller models are not. Move your physical controllers so that the virtual models line up with the real ones frozen on the screen. Then pull the trigger again.

You’ve just given the tool your first reference point. You likely noticed the environment shift a bit when you did it. You’ll have to repeat the process four more times with spheres located in different places. Each time, the video stream and the environment around you should get progressively more aligned.

Sometimes, after finishing a sphere, the environment may rotate sharply into a bad angle. This means the controller was rotated incorrectly when you pulled the trigger – it likely looked correct but was hard to verify. When this happens, simply press the grip button to remove the last reference point you added and the environment will go back to how it was before.

Gif of undoing a bad reference point

Tip: When you place your controller in a sphere but before you pull the trigger, move the controller so that its rotation is easy to match up with on the screen. This is not so difficult with Touch controllers, but it’s good to position the Vive controllers so that they are facing straight up or perpendicular to the camera. It can also help to hold them in such a way that your hands are occluding as little of them as possible.

Tip: If you’re watching the 2D screen, it can be easy to forget that you’re calibrating in three dimensions when you line up the controllers. Before you finalize a reference point, make sure that the controllers are at an equal distance to the camera – in other words, that they’re the same size on the screen.

Once you’ve finished the last sphere, everything should be well aligned. However, you should double-check the alignment by moving the controller to the far edges of the screen, and by repeating this check at different distances from the camera.

If you find the controller loses alignment at any point, you can add another reference point to fix it. Simply pull the trigger, and it’ll freeze and let you line up the controller just like you did with the spheres. You can add as many as you want.

Gif of checking alignment on either sides of screen
The alignment is quite close on the left side of the screen, but poor on the right side. In this case, another reference point should be added.

When you’re done, hit the menu button. You’ll see much of the environment turn pink. Take off your headset and you should see a screen like this:

Final calibration screen

If so, your configuration has been saved to that computer. You do not need to grab that file – the ZEDOffsetController script, which is already attached to the ZED_GreenScreen prefab, automatically loads this file at runtime.

Go back to Unity and run the scene to test it. Hold a controller in front of the camera, and you should see that it lines up perfectly with its virtual counterpart.

Next: Part 5: Controller Delay