[Unity Livelink] Position and rotate avatars to the physical room

I have a digital copy in Unity of a physical room that contains 6 ZED cameras. I want to use Live Link to display avatars in Unity at the same position they are standing in the actual room. The raw data from the cameras seems to be given with respect to the position and orientation of one of the cameras. How can I alter the avatar's position and orientation to fit that of the room? Applying a positional offset (livePositionOffset) is fine, and applying an orientation offset (liveRotationOffset) to the global root orientation works while standing still. However, when moving with an orientation offset applied, the avatar moves in a different direction than it is facing. In the video clip below, the avatar is facing the correct direction and should be moving straight forward, but due to the 90-degree orientation offset it moves 90 degrees off. How can I fix this? Below is the code in ZEDBodyTrackingManager.cs with a slightly adjusted UpdateAvatarControl (I have simply added offsets to position and orientation).

    private void UpdateAvatarControl(SkeletonHandler handler, sl.BodyData data)
    {
        Vector3[] worldJointsPos = new Vector3[handler.currentKeypointsCount];
        Quaternion[] normalizedLocalJointsRot = new Quaternion[handler.currentKeypointsCount];

        for (int i = 0; i < worldJointsPos.Length; i++)
        {
            // Positional offset added to every keypoint.
            worldJointsPos[i] = data.keypoint[i] + livePositionOffset;

            Quaternion localQuaternion = data.local_orientation_per_joint[i];
            normalizedLocalJointsRot[i] = localQuaternion.normalized;
        }

        // Orientation offset applied only to the global root orientation.
        Quaternion worldGlobalRotation = Quaternion.Euler(liveRotationOffset) * data.global_root_orientation;

        // ... rest of UpdateAvatarControl unchanged ...

@JPlou Maybe I have to do something similar to what you described here: How are joint transformations and scaling performed in livelink for Unity and Unreal? - #6 by JPlou?

Hi @haakonflaar,
I guess it does, in a way; it's about applying the correct transformation.
In your code, you add a positional offset to the keypoint positions, but that will only work if the real and virtual reference cameras are oriented the same way.

I think you have to apply the rotation between the 2 cameras to your position vector and then offset it.
Something like:

    worldJointsPos[i] = (Quaternion.Euler(liveRotationOffset) * data.keypoint[i]) + livePositionOffset;
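Putting that together, here is a minimal sketch of the adjusted loop, assuming liveRotationOffset is the Euler-angle offset (in degrees) between the reference camera and the room and livePositionOffset is the translation between them; roomRotation is just a local name introduced for illustration:

    // Rotation between the reference camera and the room, built once per update.
    Quaternion roomRotation = Quaternion.Euler(liveRotationOffset);

    for (int i = 0; i < worldJointsPos.Length; i++)
    {
        // Rotate the camera-space keypoint into room space first, then translate.
        worldJointsPos[i] = roomRotation * data.keypoint[i] + livePositionOffset;

        normalizedLocalJointsRot[i] = data.local_orientation_per_joint[i].normalized;
    }

    // Apply the same rotation offset to the root orientation so the avatar's
    // facing and its direction of travel stay consistent.
    Quaternion worldGlobalRotation = roomRotation * data.global_root_orientation;

Since the keypoints and the root orientation are rotated by the same quaternion, the skeleton's translation rotates together with its facing, which should remove the 90-degree mismatch you see when walking.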