Creating a ZED Live Link Project

We provide an Unreal project that lets you directly animate avatars with the skeleton data sent by the ZED Live Link plugin. This section explains how this project is built, and how it transforms the skeleton data coming from our ZED Live Link plugin into an animation playing in the Unreal scene.

Project Structure

We first present the main assets the project contains that allow avatars to be animated:

  • Three avatars, already imported in the project and ready to be animated. They come from Mixamo and are stored in Content/Mannequin/Character/Mesh. The tutorial Animate New Avatars explains in detail how to import your own avatars.
  • An Animation Blueprint for each of these avatars. This Animation Blueprint is what enables Live Link skeleton data to animate the avatar, and it is also where the Live Link data can be transformed. We'll explain these transformations in the next section.
  • An Avatar_livelink asset: the role of this asset is to define which avatar will be animated in the Unreal scene, and which Animation Blueprint will be used.
  • The Level Blueprint: this Blueprint automatically connects the project to the Live Link Source at Play, and instantiates an Avatar_livelink Actor for each skeleton sent by the Live Link plugin, so that it is rendered in the Unreal scene.
  • The C++ class LiveLinkOrientationsRemapAsset that inherits from LiveLinkRemapAsset, and whose role is to apply several transformations to Live Link skeleton data before applying it to the avatar.
  • The Blueprint class MixamoRemap, whose parent class is LiveLinkOrientationsRemapAsset, and whose role is to specify the correspondences between our Live Link plugin bone names and the avatar bone names.

The next section will explain what transformations are necessary in Unreal to use Live Link skeleton data.

Required Transformations of ZED SDK Raw Data in Unreal

To animate an avatar inside an Unreal project using the raw SDK data fed by the ZED Live Link plugin, a few operations must be performed. The Unreal project we provide already implements these transformations, and this section explains how they are implemented. All of these transformations are done using a Remap Asset.

The first step is to create a new C++ class in your project and select LiveLinkRemapAsset as its parent class.
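
As a reference, a minimal shell of such a class could look like the sketch below. The class name is illustrative, and the overridden BuildPoseFromAnimationData function comes from the LiveLinkRemapAsset parent class (its exact signature may differ depending on your engine version).

```cpp
// ZEDLiveLinkRemapAsset.h: illustrative shell of a custom Remap Asset class
#pragma once

#include "CoreMinimal.h"
#include "LiveLinkRemapAsset.h"
#include "ZEDLiveLinkRemapAsset.generated.h"

UCLASS(Blueprintable)
class UZEDLiveLinkRemapAsset : public ULiveLinkRemapAsset
{
	GENERATED_BODY()

public:
	// Called by Live Link to turn an incoming skeleton frame into a pose on the avatar.
	// This is where the transformations listed below can be implemented.
	virtual void BuildPoseFromAnimationData(float DeltaTime,
		const FLiveLinkSkeletonStaticData* InSkeletonData,
		const FLiveLinkAnimationFrameData* InFrameData,
		FCompactPose& OutPose) override;
};
```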

In this class you’ll be able to implement the following operations:

  • Apply only the Live Link rotations without modifying the translations, except for the root. Live Link's default behaviour is to apply both translation and rotation data to the avatar, but you might prefer to apply only the rotations and keep the avatar's local translations to avoid deforming it.

  • Change the root translation to adapt to your avatar's scale and make sure the avatar keeps its feet on the floor. If an avatar is twice as big as the skeleton output by the SDK, its root translation must be multiplied by 2. Two parameters have to be handled here. The first is the actual leg size of the avatar compared to the leg size of the skeleton coming from the SDK: you can get a first multiplication factor by measuring these two sizes directly on the bones. The avatar might also have a scale different from 1; this scale does not alter the avatar's bone sizes, so it has to be taken into account in addition to the first factor. Consequently, the multiplication factor to apply to the root translation is (avatarLegSize / SDKLegSize) * avatarScale, as shown in the sketch after this list.

  • Apply extra rotations in case the avatar’s T-Pose doesn’t correspond to a pose where all its local orientations are null.
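
For the root translation factor, a hedged sketch of the computation is shown below; AvatarLegSize, SDKLegSize and AvatarScale are placeholder values that you are assumed to measure from the avatar's reference skeleton, the SDK skeleton and the actor's scale.

```cpp
// Illustrative helper: factor applied to the root translation so the avatar
// keeps its feet on the floor. The three inputs are assumed to be measured
// elsewhere (avatar reference skeleton, SDK skeleton, actor scale).
float ComputeRootTranslationFactor(float AvatarLegSize, float SDKLegSize, float AvatarScale)
{
	// Compensate for the difference in proportions, then for an actor scale
	// different from 1 (which does not alter the bone sizes themselves).
	return (AvatarLegSize / SDKLegSize) * AvatarScale;
}
```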

Let's explain the last point in more detail. The raw SDK orientations fed by Live Link are meant to be applied to an avatar that is in T-Pose when all its local orientations are null. The T-Pose is a pose where the avatar stands upright with its arms held out horizontally. The problem is that Unreal works with the concept of a Reference Pose. The Reference Pose of an avatar is its default pose, the one you see when you open its skeleton asset window. But a Reference Pose in Unreal doesn't necessarily correspond to a pose where all orientations are null. In practice, many avatars you can import in Unreal will have a Reference Pose that is a T-Pose as expected, but this T-Pose won't necessarily correspond to null orientations.

For instance, if you import this avatar from Mixamo, you’ll see its Reference Pose is a T-Pose:

But if you set all its local orientations to 0, the resulting pose looks like this:

In order to have an Unreal project that generalizes to many avatars, it is worth applying extra rotations so that any avatar whose Reference Pose is a T-Pose can be animated.

So the idea is first to apply the rotation that takes the avatar from its “null orientations” pose to its T-Pose. This rotation information can be retrieved using FBaseCompactPose::GetRefPose.
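
For instance, assuming you are inside the Remap Asset with the output compact pose at hand, this rotation can be read as in the sketch below (illustrative, not the exact code of the provided project):

```cpp
// Illustrative: rotation bringing a joint from the "null orientations" pose
// to the avatar's Reference Pose (here, its T-Pose), read from the compact pose.
FQuat GetRefPoseRotation(const FCompactPose& OutPose, FCompactPoseBoneIndex BoneIndex)
{
	return OutPose.GetRefPose(BoneIndex).GetRotation();
}
```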

The problem is that after applying these rotations, the avatar is in T-Pose, but the rotated joints, as well as their children, are no longer expressed in the same coordinate frame as before. For instance, you can see here that the avatar's left shoulder coordinate frame has been rotated by the same rotation that was applied to the joint:

The SDK orientations are expressed in the LEFT_HANDED_Z_UP coordinate frame with the avatar facing the +X axis, so we can't directly apply these orientations in this new coordinate frame. To apply the rotations correctly, we must do the following:

  • For each joint, apply the kinematic chain of all its parent joints' rotations in order to put it in the right coordinate frame. For a given joint whose parent is the joint jn, we apply R_j0 * R_j1 * … * R_jn, where each j(k-1) is the parent of j(k).
  • Apply the SDK rotations, now that we are in the right coordinate frames.
  • Apply the inverse of the kinematic chain computed in the first step to put the joints back in their original coordinate frame (see the sketch after this list).
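
A hedged sketch of this change of coordinate frame is given below. ParentChainRotation stands for the accumulated rotation R_j0 * R_j1 * … * R_jn of the parent joints, and SdkRotation for the raw joint orientation received from the plugin; depending on the rotation composition convention you use, the inverse may have to sit on the other side.

```cpp
// Illustrative: express the SDK rotation in the joint's rotated coordinate
// frame by conjugating it with the accumulated parent-chain rotation
// (apply the chain, then the SDK rotation, then the inverse chain).
FQuat RemapSdkRotation(const FQuat& ParentChainRotation, const FQuat& SdkRotation)
{
	return ParentChainRotation.Inverse() * SdkRotation * ParentChainRotation;
}
```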

You can find in the Unreal project we provide a directly usable implementation of these transformations: LiveLinkOrientationsRemapAsset.

Next, if your avatar's bones don't have the same names as the ZED SDK bone names (explained in the first section), you need to create a new Blueprint class whose parent class must be the C++ Remap Asset class you just created.

Open this Blueprint class and override the function “GetRemappedBoneNames”. There you must do a remapping similar to the one in the picture below, where the bones explained in the first section are mapped to their names in your imported avatar. The bone names in the “Switch on Name” node correspond to the bone names given by our Live Link plugin, and the bone names in the “Return Node” nodes correspond to your imported avatar's bone names. The Unreal project we provide has an example of such a Remap Asset usable for Mixamo avatars, called MixamoRemap.
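
If you prefer to keep this remapping in C++ instead of a Blueprint, the same idea can be expressed with a simple name-to-name map, as in the sketch below. The bone names used here are only illustrative: use the ZED Live Link bone names listed in the first section and the names of your own avatar.

```cpp
// Illustrative bone-name remapping (not an exhaustive list).
// Left: bone names sent by the ZED Live Link plugin; right: avatar bone names.
static const TMap<FName, FName> BoneNameMap = {
	{ FName("PELVIS"), FName("mixamorig:Hips") },
	{ FName("NECK"),   FName("mixamorig:Neck") }
};

// Return the avatar bone name for a given Live Link bone name,
// falling back to the original name if no remapping is defined.
FName RemapBoneName(const FName& SdkBoneName)
{
	const FName* Remapped = BoneNameMap.Find(SdkBoneName);
	return Remapped ? *Remapped : SdkBoneName;
}
```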

Once done, you must apply this Remap Asset to the Live Link data. Data from Live Link can be applied directly inside an Animation Blueprint, and when creating an Animation Blueprint for your avatar, you'll have to include a Live Link Pose node. This node is where you set the custom Remap Asset you created to transform the skeleton data coming from Live Link.

Implementing the transformations above will give you a behaviour similar to that of our provided Unreal project. In the Getting Started and Animate New Avatars tutorials, we assume you're using this provided Unreal project.