Green Screen VR Capture

Capturing VR in Mixed Reality

In this tutorial, you will learn how to capture VR applications in mixed reality using the ZED, an HTC Vive or Oculus Rift, and a green screen. For an in-depth article on capturing VR content with an external camera, read this tutorial.

The main advantage of using depth data from the ZED camera is pixel occlusion and lighting: the depth of every real pixel in the camera's view is compared against its corresponding virtual pixel, and the nearer one is rendered. In addition, you can enhance your real scene with virtual lights and shadows.

Requirements

For this tutorial, you will need the following items:

- ZED camera
- Green screen
- HTC Vive or Oculus Rift headset
- 2x Vive controllers, 1x Vive Tracker and 2x controllers, or 2x Oculus Touch controllers

First, you will need to mount a controller/tracker onto the ZED and make sure they are firmly attached. The controller lets you track the ZED in the same space as the HMD and allows you to move the camera while filming. To attach the ZED and a controller/tracker from Oculus or Vive, you can 3D print the following mount (download STL, OBJ). If you don't have access to a 3D printer, you can attach the devices with clamps, tape, etc., but you must ensure two things: first, that your attachment mechanism doesn't occlude too many of the controller/tracker's sensors; second, that the attachment is rigid (no wobbling), or you will have to recalibrate constantly.

How It Works

To start recording mixed-reality footage, we will go through the following steps:

1. Create a new Unity project and import the ZED_GreenScreen prefab.
2. Adjust ZED camera settings to improve image quality.
3. Adjust the chroma key to remove the green screen.
4. Add a VR headset.

Note: If you experience random freezes while using the Vive with the ZED, update your NVIDIA drivers (version 390.77 or greater for GeForce cards) to fix the issue. Read more.
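The per-pixel occlusion described above can be sketched in a few lines. This is an illustrative Python sketch only (the ZED plugin does this on the GPU in a shader; the function and pixel names here are made up for the example): for each pixel, whichever source is closer to the camera wins.

```python
def composite(real_rgb, real_depth, virtual_rgb, virtual_depth):
    """Per-pixel depth compositing: pick whichever source is nearer.

    Illustrative sketch, not the ZED plugin's actual shader. Each
    argument is a flat list with one entry per pixel.
    """
    out = []
    for rr, rd, vr, vd in zip(real_rgb, real_depth, virtual_rgb, virtual_depth):
        # The real pixel is drawn when its measured depth is nearer
        # than the virtual scene's depth at the same pixel.
        out.append(rr if rd <= vd else vr)
    return out

# A 4-pixel row: the real subject (1.2 m away) occludes a virtual
# object at 2.0 m, while a nearer virtual object at 0.5 m occludes him.
print(composite(["R1", "R2", "R3", "R4"],
                [1.2, 1.2, 3.0, 3.0],
                ["V1", "V2", "V3", "V4"],
                [2.0, 0.5, 2.0, 5.0]))
# → ['R1', 'V2', 'V3', 'R4']
```

This is why a real hand can pass in front of a virtual object, and a virtual object can pass in front of the subject, without any manual layering.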
Create a New Project

1. Open your Unity project, or create a new project.
2. Follow the instructions on importing the ZED Unity package.
3. In Edit > Project Settings > Player, check Virtual Reality Supported.
4. In the Project panel, navigate to Assets > ZED > Examples > GreenScreen > Prefabs.
5. Drag and drop ZED_GreenScreen.prefab into the Hierarchy view.
6. In the Hierarchy view, select ZED_GreenScreen > Camera_Left.
7. Click the Play button.

Your Unity interface should look like this:

Adjusting Camera Parameters

In ZED_GreenScreen > ZED Manager (Script), make sure the camera is set to HD1080 mode (1920x1080 resolution). Changing the resolution requires restarting the scene.

After selecting ZED_GreenScreen > Camera_Left, click the Camera Control button of the Green Screen Manager in the Inspector view. By default, the camera exposure and white balance are adjusted automatically. Auto white balance tends to add a green tint to the picture when the ZED is placed in front of a green screen. Since you're shooting in a controlled environment with fixed lighting, you can disable the automatic settings and manually adjust the parameters in the Camera Control panel. Hit Save once you're happy with your settings so they can easily be loaded later after restarting Unity.

Adjusting Chroma Key Settings

In the Inspector panel, you can adjust your chroma key settings with the Green Screen Manager script. Below are explanations of what each setting does and when to adjust it.

Views

Select the FOREGROUND view to get a better feel for what remains visible in your scene and what doesn't. This can be useful when tuning the critical settings in the next section. Change the view to FINAL to see the result.

Color Range

The Color, Range, and Smoothness values are the most important, and where you'll likely spend the most time. For each frame, the script checks each pixel's color to decide whether to turn it invisible.
If you're using a background that's not green, adjust the Color setting to match. Otherwise, adjust the shade of green to better fit the screen's exact color and your lighting setup.

Range indicates how close a color can be to your background's exact color and still be keyed out. You need some range because even the highest-quality green screen studio has shading to it.

Smoothness defines how colors just outside the range are blended in. With zero smoothness, a color just outside the range is completely visible; with more smoothness, it is partially visible. You'll get further control over how smoothness affects the image in the next section.

The easiest way to find good Range and Smoothness values is to start the scene, set both values to 0, adjust the Range, then adjust the Smoothness.

Refining the Matte

Next, refine the edge of the matte with the Erosion, Black Clip, White Clip, Edge Softness, and Despill settings. The ALPHA view can be helpful for making these adjustments.

The Erosion setting erodes the edges of the matte to remove the remaining green areas.

Without erosion / With erosion

White Clip and Black Clip: alpha values above the White Clip value are set to 1 (fully opaque), and alpha values below the Black Clip value are set to 0 (fully transparent). In other words, White Clip makes nearly opaque pixels fully opaque, and Black Clip does the same for nearly invisible pixels. This helps remove noise resulting from your Smoothness setting.

Without white clip / With white clip

Despill reduces green-screen color reflections on the keyed subject. Increase it if a slight green hue shows on your subject, but be careful not to use more than you need, or the other colors will appear less vibrant. Change the view to FINAL before adjusting despill.

Without despill / With despill

Finally, store your chroma key parameters by clicking Save.
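To make the Color/Range/Smoothness and clip settings concrete, here is a minimal sketch of the idea in Python. This is illustrative only, not the plugin's actual GPU shader; the exact color-distance metric the plugin uses may differ, and all names here are made up for the example.

```python
def key_alpha(pixel, key_color, color_range, smoothness):
    """Alpha from a pixel's distance to the key color: 0 (invisible)
    inside Range, ramping up to 1 across the Smoothness band."""
    dist = sum((p - k) ** 2 for p, k in zip(pixel, key_color)) ** 0.5
    if dist <= color_range:
        return 0.0                                # within Range: keyed out
    if smoothness <= 0 or dist >= color_range + smoothness:
        return 1.0                                # well outside: fully visible
    return (dist - color_range) / smoothness      # partial blend

def clip_alpha(alpha, black_clip, white_clip):
    """White/Black Clip: snap nearly opaque pixels to 1 and nearly
    transparent pixels to 0, removing smoothness noise."""
    if alpha >= white_clip:
        return 1.0
    if alpha <= black_clip:
        return 0.0
    return alpha

key = (40, 200, 60)                           # the screen's measured green
print(key_alpha(key, key, 30, 20))            # exact key color → 0.0
print(key_alpha((40, 160, 60), key, 30, 20))  # 40 units away → 0.5
print(clip_alpha(0.95, 0.1, 0.9))             # near-opaque → 1.0
```

The sketch shows why tuning Range first with Smoothness at 0 works well: Range sets the hard cutoff, and Smoothness only widens the transition band around it.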
Garbage Matte

The garbage matte script lets you extend the virtual world beyond the green screen's limited size. It allows you to define a region in space and exclude image pixels that do not belong to this region.

To use camera tracking with the garbage matte (which we don't do in this demo), you need to activate tracking before setting up your garbage matte. Tracking can be activated by selecting the ZED_GreenScreen prefab and enabling it in the ZED Manager component.

1. Click Enable Garbage Matte.
2. Start your scene.
3. Make sure Place Markers is selected.
4. Define your green screen region by clicking on the image in your Game window to place your markers. You should see a sphere appear at each place you click.
5. Repeat the previous step until the whole green screen area is covered with a white mesh.
6. Click Apply Garbage Matte to exclude the region outside the mesh (it is set to transparent).
7. To save and load markers, use the Save/Load options.

Note: If the area is not covered correctly, use Remove Last Marker to disable the garbage matte and remove the last marker. You can also adjust the position of a specific marker: right-click to select a sphere, then left-click on a new position.

Adding the VR Headset

Interfacing the ZED with SteamVR

SteamVR is required for the HTC Vive and optional for the Oculus Rift.

1. Download and import the SteamVR Unity package.
2. Add the [CameraRig] prefab located in the SteamVR > Prefabs folder.
3. Right-click in the Hierarchy view, then click Create Empty to create a GameObject.
4. Drag the ZED_GreenScreen prefab and drop it onto the GameObject. Then make sure its Position and Rotation values are all zero in the Inspector panel.

To film someone in a VR headset, you'll need to calibrate the ZED's virtual position and rotation so that it matches the ZED's real position relative to the VR headset's tracked space. We'll use SteamVR to track the object the ZED is attached to.
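Before moving on, it's worth seeing what the garbage matte does conceptually: pixels whose image coordinates fall outside the polygon traced by your markers are made transparent. A minimal 2D sketch of that test, illustrative only (the plugin actually builds a mesh from your markers rather than running this per-pixel Python):

```python
def inside(point, polygon):
    """Ray-casting point-in-polygon test: count how many polygon edges
    a horizontal ray from the point crosses; odd means inside."""
    x, y = point
    hit = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does this edge straddle the point's horizontal line,
        # and does the crossing lie to the right of the point?
        if (y1 > y) != (y2 > y) and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
            hit = not hit
    return hit

# Marker positions in normalized image coordinates (illustrative values):
screen = [(0.1, 0.1), (0.9, 0.1), (0.9, 0.9), (0.1, 0.9)]
print(inside((0.5, 0.5), screen))   # True: pixel kept
print(inside((0.95, 0.5), screen))  # False: outside the matte → transparent
```

Anything outside the marked region is discarded regardless of its color, which is what lets the virtual world extend past the physical edges of the green screen.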
1. Select the ZED_GreenScreen object and add the ZEDControllerTracker script. If the component in the Inspector only displays a "Load data" button, press it. It should now look like this:
2. Change the Device to Track option to match the object to which you've attached your ZED.
3. Drag and drop the ZED_GreenScreen root object into the Zed Manager field. This tells the plugin not to apply the latency compensation designed for controllers to the ZED, which can cause unwanted movement when tracking the ZED itself.
4. Lastly, select the ZED_GreenScreen object and add a ZEDOffsetController component. This will move the virtual ZED to match the tracked object using the calibration file we're about to create.

Calibrating the Camera

Manually calibrating the ZED can be tedious, so we'll use a (beta) calibration app that Stereolabs provides for SteamVR. Download it here and run it. Note that you need CUDA 9.1 (exactly) to run the calibration app; you can download it here. It will not conflict with later versions of CUDA if you have both installed.

1. Select Pad if using a controller, Tracker if using a Vive Tracker, and None if you haven't attached the ZED to a tracked object. Then click Validate.
2. Put on your headset and stand in front of the ZED. You'll see a "point cloud" version of what the ZED sees, including you, but it won't be lined up with reality.
3. Your controller will have a UI floating over it with X, Y and Z values. Using the thumbstick/trackpad, adjust them until the point cloud roughly lines up. Hold down the trigger to get to the next step.
4. Now you'll supply the app with some reference points it will use to perform the calibration. You'll use the 2D video feed on the floating virtual screen to synchronize each point with the real world. Look for a blue sphere and put your controller inside it. It'll turn green. Pull the trigger. The 2D screen's video feed will freeze, except for the virtual representations of the controllers.
Move the controller until its virtual image is perfectly lined up with the real image, then pull the trigger to move on to the next point. If the point cloud ever jumps out of place after adding a new point, press the grip button to undo it and try again.

Once you've done all the points it asks for, press the menu button and take off the headset. You'll see a screen like this:

Close the app, go back to Unity, and run the scene. The calibration file will load automatically. You'll now see your subject in the virtual world as if they were really there.
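What the loaded calibration does can be sketched simply: the virtual ZED's world pose is the tracked object's pose composed with the saved local offset. The sketch below is illustrative only, reduced to 2D (x, z) and yaw for brevity; the actual ZEDOffsetController composes full 3D positions and rotations through Unity's transform hierarchy, and all names here are made up.

```python
import math

def apply_offset(tracker_pos, tracker_yaw_deg, offset_pos, offset_yaw_deg):
    """Compose a tracked object's world pose with a calibrated local
    offset (2D, yaw-only sketch of what the calibration file encodes)."""
    yaw = math.radians(tracker_yaw_deg)
    c, s = math.cos(yaw), math.sin(yaw)
    ox, oz = offset_pos
    # Rotate the local offset into world space, then translate by
    # the tracker's world position.
    wx = tracker_pos[0] + c * ox + s * oz
    wz = tracker_pos[1] - s * ox + c * oz
    return (wx, wz), tracker_yaw_deg + offset_yaw_deg

# Tracker at the origin, ZED mounted 0.1 m to its side with no extra yaw:
print(apply_offset((0.0, 0.0), 0.0, (0.1, 0.0), 0.0))
# → ((0.1, 0.0), 0.0)
```

Because the offset is applied relative to the tracked object every frame, you can move the camera rig freely while filming and the virtual ZED follows.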