How to Make Mixed Reality VR Videos in Unity
Part 3: Positioning the Camera
As mixed reality relies on the real and virtual objects being synchronized, you need to position the real-world camera in such a way that its position and rotation are always known.
There are three ways to do this:
- Fixed Position: The ZED remains fixed in one place. You measure where this is in relation to the VR world, place the virtual camera there, and never move either. This is the simplest method, but the least flexible.
- ZED Positional Tracking: The ZED starts in the same place each time, but uses its own inside-out tracking to keep the virtual camera in sync with the real one. This gives more flexibility, but requires you to return the camera to the exact same place each time you start the scene, which can be difficult.
- Vive Tracker or Controller: The ZED is attached to an object tracked by Oculus’ Constellation or SteamVR’s Lighthouse system, such as a Vive Tracker. This requires either extra hardware or sacrificing one of your existing controllers, plus a more involved setup. But once configured, it’s the most flexible and reliable solution.
Note: This section will help you keep the real and virtual cameras in sync, but they won’t be perfectly aligned until you calibrate them in the next section. Keep this in mind as you run tests in this section. It may be best to use the Scene view, rather than the Game view, to validate your work.
Note: Oculus does not currently support using a third controller as a separate tracked object, except for use in its own integrated mixed reality system. If you are using the Oculus Rift for this tutorial, you will have to use the first two options or sacrifice one of your standard two controllers to mount to the ZED.
Fixed Position
This is the easiest method to set up, but it only works if you don’t plan to move the camera at all once the scene starts. All you need to do is find the correct position and rotation of the ZED relative to the scene, set the virtual camera to match, and leave both in place.
The simplest way is to use a controller. Since you only need to find the position once, you don’t need a third controller to make this solution work. Run the game in the editor, place your controller in the spot where you want the ZED to be, and rotate it the way you want the camera to face. Try to get it close, but don’t worry about being exact – you’ll fine-tune the numbers in the Calibration section. Then select the controller in the Hierarchy and write down the three values next to Position, and the three next to Rotation, shown in the Inspector.
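If reading the values while the game runs proves fiddly, a throwaway script can print them to the Console instead. This is a minimal sketch (the class name LogPose is our own, not part of the ZED or VR plugins); attach it to the controller’s tracked object before pressing Play, then copy the logged values:

```csharp
using UnityEngine;

// Hypothetical helper: attach to the controller's tracked object while the
// scene runs, then copy the logged values into the ZED_GreenScreen Transform.
public class LogPose : MonoBehaviour
{
    void Update()
    {
        // World-space position and Euler angles; for a root-level object these
        // match what the Inspector's Transform fields display.
        Debug.Log("Position: " + transform.position +
                  "  Rotation: " + transform.eulerAngles);
    }
}
```

Remember to remove the script once you have the numbers; it only exists to capture the starting pose.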
Tip: Double check that “forward” on your controller is the way you think it is. For example, “forward” on a Vive wand is lengthwise out the front of the handle, parallel with the track pad.
Once you’ve written down those values, stop the game and select the ZED_GreenScreen object, being careful not to select its child camera instead. Copy the numbers you wrote down into the prefab’s Position and Rotation.
Also, while you’re there, find the “Tracking” checkbox in the Inspector and make sure it’s unchecked (it’s checked by default). The tracking feature is useful if you want to move the camera, but otherwise it’s best left disabled, as it consumes a fair amount of resources.
ZED Positional Tracking
The ZED can track its change in position and rotation without a third controller or Tracker.
In this method, when you move the ZED in real life, the ZED’s own tracking feature will know how its position and rotation changed without the need for an externally tracked object. All the tracking is done automatically, so there’s nothing special you have to do to keep it updated.
First, you just need to put the prefab at the real-world starting point. To do that, follow the instructions in the Fixed Position section above, except you should leave the “Tracking” value checked in the prefab.
However, the ZED tracks its position and rotation relative to its starting point. Every time you restart the scene, the ZED has to begin in the exact same place, with the exact same rotation, as before. Otherwise, the real and virtual worlds won’t line up.
To make that easier, we’ll add one extra step: mark the position of your ZED in the real world. Some suggestions for how to do this:
- If laying the ZED on a flat surface, it’s best not to use the mini tripod that came with the ZED as it’s hard to keep track of all three rotational axes. Use tape to outline its starting position.
- If you must use the mini tripod, find some way to mark all three axes, such as a piece of clay placed and shaped such that an edge of the ZED can rest on it.
- If using a (regular) tripod, put small pieces of tape under each leg of the tripod.
- If you plan on rotating the ZED on the tripod, mark on the floor the exact direction the ZED was facing at the start. If you don’t plan on rotating it, minimize the ZED’s ability to rotate on its own: turn it clockwise as far as it will go on the ¼” screw, and tighten any knobs on the tripod that could otherwise let it spin freely.
- If you plan on adjusting the height of the tripod, wrap pieces of tape at the bottom of each rod segment where it enters the one beneath it.
Clearly mark where the camera belongs each time you start a scene.
Note: If using the new ZED mini, its internal IMU will provide the accurate rotation of the camera at all times, so marking rotation is not as critical.
With the starting position and rotation marked, start the scene, note how things look, and move the camera around freely. You should see the virtual camera move in the same way as the real one. When you’re done, put the camera back exactly where it started and run the scene again. If your markings worked, the scene should line up just as it did the first time.
Vive Tracker or Extra Controller
Because Rift and Vive tracked objects are continuously tracked from fixed points in your room, attaching such a device to your ZED is the best way to keep your camera’s position and rotation perfectly aligned. You can launch a scene without worrying about its starting point. And because the Rift’s Sensors and the Vive’s Lighthouses provide known, fixed points of reference, this tracking is less error-prone than any inside-out solution (including the ZED’s), which is designed to work when such references aren’t available.
You can use an Oculus Touch Controller, a Vive Controller, or a Vive Tracker. Whichever you pick, you’ll have to physically attach it to your ZED. There are many ways to do that. If you have access to a 3D printer, you can download our mounting brace template (.obj, .stl) and print it out. The same mount will fit a Vive Tracker, a Vive controller or an Oculus Touch.
If you don’t have access to a 3D printer, feel free to improvise with clamps, tape, etc. But you must ensure two things: first, that your attachment mechanism doesn’t occlude too many of the device’s sensors (hidden in Touch controllers, but indicated by divots on Vive Controllers and Trackers); second, that the attachment is firm (no wobbling), or you will have to recalibrate constantly.
Note: Our printable mounting brace does not fit the new ZED mini.
Note: If you’re not sure where on the ZED you should mount your controller or Tracker for the best results, know that it’s not especially significant as the calibration accounts for the offset. Mount it where you can best keep it steady. However, mounting it directly to or closer to the ZED will minimize the effect of any wobbling that does occur.
Now, you must add virtual objects that correspond to the tracked objects. Unlike headsets, Unity has no native way to add Touch or Vive controllers or other tracked objects without writing code against its Input system. However, both Oculus and SteamVR (Vive) offer free plugins on Unity’s Asset Store that greatly simplify this.
These plugins are Oculus Integration and SteamVR Plugin. If they are not already part of your project, go to the Asset Store tab within the Unity editor, search for the applicable plugin, and hit Import.
Next, you’ll see a window asking you which of the package’s assets you want to import, like when you imported the ZED package. You don’t need all of them to finish this tutorial, but unless you’re already familiar with the plugin, it is simplest just to leave them all selected and hit Import.
Unity will then import your package into your project, which may take some time. When importing either plugin, you may also get a window letting you know that Unity will automatically update some scripts of yours. If you’re working in an existing project with complex scripting already in place, you should consider making a backup of your project before proceeding. Otherwise, hit I made a backup. Go Ahead!
With the plugin in your project, it’s now time to add a virtual object to mirror its real counterpart. How you do this depends on what kind of object it is:
Oculus Touch Controller: If you don’t already have one in your scene, go to OVR -> Prefabs in the Project window and drag an OVRCameraRig into your Hierarchy. Expand the new object, then expand the TrackingSpace object parented to it. The virtual object you will be using is either LeftHandAnchor or RightHandAnchor, depending on which controller you will be using.
Vive Controller or Tracker: If the tracked object you’ve mounted to your ZED is not already in the Hierarchy, then with nothing selected, right-click anywhere in the Hierarchy and choose Create Empty. Rename it to something indicating its role (we’ll use “ZEDTracker”). Then add the SteamVR_TrackedObject script to it by clicking Add Component in the Inspector.
In the component’s options, change the Index value to match the number of the device you want to use. Device indexes are assigned at runtime, however, so you will need to experiment to find which one is yours. The index typically follows the order in which SteamVR detected the devices (including the Lighthouses).
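To take some of the guesswork out of finding the right index, you can list every connected OpenVR device and its class. This is a sketch using the SteamVR plugin’s Valve.VR bindings (the class name ListTrackedDevices is our own); a Vive Tracker reports as GenericTracker:

```csharp
using UnityEngine;
using Valve.VR;

// Hypothetical helper: logs the class of every connected OpenVR device so you
// can identify which index belongs to the Tracker or controller on your ZED.
public class ListTrackedDevices : MonoBehaviour
{
    void Start()
    {
        CVRSystem system = OpenVR.System;
        if (system == null)
        {
            Debug.LogWarning("OpenVR is not running.");
            return;
        }

        for (uint i = 0; i < OpenVR.k_unMaxTrackedDeviceCount; i++)
        {
            ETrackedDeviceClass deviceClass = system.GetTrackedDeviceClass(i);
            // e.g. HMD, Controller, GenericTracker, TrackingReference (Lighthouse)
            if (deviceClass != ETrackedDeviceClass.Invalid)
                Debug.Log("Device " + i + ": " + deviceClass);
        }
    }
}
```

Drop the script on any object in the scene, press Play, and match the logged index against the device you’ve mounted to the ZED.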
Tip: The [CameraRig] prefab in SteamVR -> Prefabs provides an easy way to manage your headset and first two controllers, visualize your bounds, and more. It also intelligently assigns device numbers to your controllers and to tracked objects added according to these instructions. It’s worth using this prefab if it doesn’t conflict with your project’s existing setup.
Finishing up your tracked object
Now, with either your Rift controller or Vive tracked object identified, drag your ZED prefab onto it to make it a child. That way, when the tracked object moves, Unity will know your ZED has moved by the same amount.
The last step before calibration is to disable the ZED’s own tracking system, which would otherwise conflict with the Rift’s or the Vive’s. With the ZED_GreenScreen object selected, look in the Inspector under ZED Manager and make sure Enable Tracking is disabled.
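The parenting step can also be done from a startup script, which is handy if you instantiate the ZED rig at runtime rather than placing it in the scene by hand. A minimal sketch, assuming both references are wired up in the Inspector (the class name AttachZedToTracker is our own, not part of either plugin):

```csharp
using UnityEngine;

// Hypothetical setup helper: childs the ZED rig under the tracked object so it
// inherits that object's pose every frame. Assign both fields in the Inspector.
public class AttachZedToTracker : MonoBehaviour
{
    public Transform trackedObject;   // e.g. "ZEDTracker" or a hand anchor
    public GameObject zedGreenScreen; // the ZED_GreenScreen prefab instance

    void Start()
    {
        // worldPositionStays = false keeps the rig's local offset relative to
        // the tracked object; you'll refine that offset during calibration.
        zedGreenScreen.transform.SetParent(trackedObject, false);
    }
}
```

Remember that the ZED’s own tracking must still be disabled on the prefab as described above, regardless of how the parenting is done.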
Note: On the Rift, which controller is left and which is right never changes. On the Vive, however, the ambidextrous controllers are assigned left or right roles when the game starts. If you are using only two Vive controllers in total, this matters when you parent your ZED prefab to a controller: position your controllers the same way at startup so that the correct one gets assigned.
Next: Part 4: Calibration