I’m having some issues with using custom avatars in the example scene “Body Tracking Multi”. When I use an avatar that I bought on the Unity Asset Store, the joint rotations seem to be off. I’ve tried a few different characters and several of them show this issue.
Here’s a video of what the issue looks like: Zed_issue.mp4 - Google Drive
I have done all the steps that are outlined in the documentation. I’m guessing someone knows what’s happening here?
(ignore the missing shaders in the video)
The character is properly set up in the Avatar Configuration view in Unity. Here’s a picture of that: Skärmbild 2024-04-04 180249.png - Google Drive
Normal character animations work fine.
Can you share an avatar I could reproduce the issue with? (ideally a free one)
Our theory is that as long as it fits Unity’s “Humanoid” pipeline, it should work correctly, but maybe some adjustments need to be made and we will have to update the documentation.
However, I had to add the neck bone manually in the avatar setup, since for some reason it was missing there. The rig does have a neck bone, so once you add it in the settings the avatar works with the ZED setup.
All I did to this avatar was take one of the prefabs, add the Skeleton Handler script, set the animator controller to the EmptyIKAnimatorController, and add it to the Avatar section of the ZED Body Tracking Manager in the test scene; nothing else.
I’m sorry to rush you, but have you had time to look at this? We are closing in on a deadline and I need to decide whether we can use the ZED camera at all or need to find another solution.
I tried a few things, and I could not find a quick fix. I suspected the bones’ orientations, but our sample avatars do not share a common rigging, and they all work with the SDK animations. It could still be a lead, as the rigging of the character you sent is different from ours.
I think the issue is linked to how we handle the animation in the backend, not to a Unity configuration issue. That will necessitate more investigation on our side, sorry.
I will log it for further investigation. Thank you for the report.
@JPlou Oh ok, that’s very unfortunate. Strange that no one else has had this issue, since it seems to happen with quite a few of the avatars on the Asset Store?
I’m having the same issue. All 4 pack-in avatars seem fine (Unity_fbx, Remy_fbx, YBot_fbx, WhiteMesh_fbx), but every humanoid avatar we’ve tried so far from the Unity store interprets specifically the shoulder angles as being (I think) 90 degrees off about an axis; e.g. holding an arm straight forward from the chest makes the avatar’s arm go straight up, and holding an arm straight up makes the avatar’s arm go straight backwards.
Example free models that fail:
Example paid model that fails:
I am casually versed in the technical mechanisms of rigging and animation, but not in the tooling/workflow to redo it from scratch, nor specifically in Unity’s humanoid skeleton system, and I’m not seeing anything in the Avatar settings on the Unity side that looks different by a matter of 90 degrees between the sample rigs and the store rigs. Has there been any further progress replicating/analyzing this on the Stereolabs end?
After a bit more digging, I am also a little confused about how exactly the animation retargeting is expected to work. The gist of the technique appears to be letting the intended mechanics drive a controller on a known-good model with one avatar rig, and then using the same controller to tether a second model, with a different avatar rig, to the first. In the ZED setup, that’s already the architecture; afaict every model is going to need its own Avatar asset while sharing the same EmptyIKAnimatorController. So, without even being 100% certain where/how ZEDBodyTrackingManager would discern which of two child models on a prefab to hook into as the primary, I’m not sure how this would functionally diverge from the subservient model being controlled directly.
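For reference, here’s a rough sketch of my mental model of what “retargeting” between two humanoid rigs boils down to in script form: reading a pose from one rig into Unity’s normalized muscle space and writing it onto another. I’m not claiming this is what the ZED plugin does internally; the class and field names are just mine.

```csharp
using UnityEngine;

// Sketch only: pose is read from the source rig into Unity's normalized muscle
// space (HumanPoseHandler) and then written onto a different humanoid rig.
// Not necessarily what the ZED plugin does internally.
public class MuscleSpaceRetargetSketch : MonoBehaviour
{
    public Animator source; // known-good avatar, with its own humanoid Avatar asset
    public Animator target; // store-bought avatar, with its own humanoid Avatar asset

    HumanPoseHandler sourceHandler;
    HumanPoseHandler targetHandler;
    HumanPose pose;

    void Start()
    {
        sourceHandler = new HumanPoseHandler(source.avatar, source.transform);
        targetHandler = new HumanPoseHandler(target.avatar, target.transform);
    }

    void LateUpdate()
    {
        // Read whatever pose the source currently holds (however it was produced)
        // and apply it to the target. Script execution order matters: this has to
        // run after whatever script poses the source.
        sourceHandler.GetHumanPose(ref pose);
        targetHandler.SetHumanPose(ref pose);
    }
}
```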
fwiw I did take a swing at “animation retargeting”, and no dice. Just as a PoC, I dropped a problematic avatar inside a working one and slapped a runtime controller copier script on it so they’d both be guaranteed to have the same controller even after instantiation. While the working avatar runs, no motion is transferred to the target avatar; I can only guess that’s because the motion isn’t coming from a controller-driven animation but from raw bone manipulation in script.
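The copier was nothing fancier than this (illustrative names, and obviously not a fix):

```csharp
using UnityEngine;

// PoC "runtime controller copier": keep the target Animator on the same controller
// as the working one, even after instantiation. This only shares controller-driven
// animation; it does nothing if the source is posed by direct bone writes.
public class RuntimeControllerCopier : MonoBehaviour
{
    public Animator source; // working, ZED-driven avatar
    public Animator target; // problematic store avatar nested under it

    void LateUpdate()
    {
        if (source == null || target == null) return;
        if (target.runtimeAnimatorController != source.runtimeAnimatorController)
            target.runtimeAnimatorController = source.runtimeAnimatorController;
    }
}
```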
I can also confirm that while the shoulders are the most conspicuously off, and a very hacky patch that forces them a further 90 degrees about their parent’s Y/up axis does improve matters, there are also quite a few smaller tangles and misalignments that make Unity store avatars unusable. And even breakpointing and checking the shoulder rotations during default/rest transform capture isn’t showing anything conspicuous that would differentiate the good models from the bad.
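For clarity, the “hacky patch” is roughly this kind of thing; the sign and exact axis are per-avatar guesswork, and it has to run after the ZED script has written the bones:

```csharp
using UnityEngine;

// Hacky shoulder patch, for illustration only: push each shoulder a further
// 90 degrees around its parent's up axis after the ZED script has posed it.
public class ShoulderHackPatch : MonoBehaviour
{
    public Animator animator;
    public float extraDegrees = 90f;

    void LateUpdate()
    {
        RotateShoulder(HumanBodyBones.LeftShoulder, extraDegrees);
        RotateShoulder(HumanBodyBones.RightShoulder, -extraDegrees);
    }

    void RotateShoulder(HumanBodyBones bone, float degrees)
    {
        Transform t = animator != null ? animator.GetBoneTransform(bone) : null;
        if (t == null || t.parent == null) return;
        // Rotate about the parent's up axis in world space; direction is a guess.
        t.Rotate(t.parent.up, degrees, Space.World);
    }
}
```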
The art team I work with is now trying to create new avatar skins based on the armatures of the provided sample models, and they’ve come across an anomaly that may help explain what’s going on. Apparently, on at least one model, the pose the armature takes in “pose” mode in their editor and the pose it takes in “edit” mode are inconsistent. When they do their mesh fitting and weight painting against the “edit” pose, as per their usual workflow, by the time the model is operated by the ZED modules it has been conflated with the “pose” pose. E.g. if one pose is a traditional T pose and the other is an A pose, and they use the A pose as the bind pose because it’s a more natural fit for clothing models, then the model’s arms are off in ZED and I need to hold a T pose for the model to hold an A pose. This is despite the armature/avatar/skeleton all appearing normal in the Unity editing flow.
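In case it helps anyone check their own models, this is the kind of diagnostic I’m planning to run on the Unity side (untested sketch): compare each bone’s current rest transform against the bind pose matrices stored in the skinned mesh. If the skeleton’s rest pose matches the pose the mesh was skinned in, every delta should be close to identity; a roughly 90-degree delta on the shoulders/arms would line up with what the art team is seeing.

```csharp
using UnityEngine;

// Untested diagnostic sketch: with the model in its default (unposed) state,
// compare each bone's current transform with the mesh's stored bind poses.
// bindposes[i] was baked as bones[i].worldToLocalMatrix * root.localToWorldMatrix,
// so if the skeleton currently sits in its bind pose, 'delta' is ~identity.
public class BindPoseChecker : MonoBehaviour
{
    public SkinnedMeshRenderer smr;

    void Start()
    {
        Matrix4x4[] bindPoses = smr.sharedMesh.bindposes;
        Transform[] bones = smr.bones;
        Matrix4x4 rootToWorld = smr.transform.localToWorldMatrix;

        for (int i = 0; i < bones.Length && i < bindPoses.Length; i++)
        {
            if (bones[i] == null) continue;
            Matrix4x4 delta = bones[i].worldToLocalMatrix * rootToWorld * bindPoses[i].inverse;
            float angle = Quaternion.Angle(Quaternion.identity, delta.rotation);
            if (angle > 1f)
                Debug.Log($"{bones[i].name}: rest pose differs from bind pose by {angle:F1} deg");
        }
    }
}
```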