UE5 Live Link | determine which person is which

Hey everyone,

I’m currently working on a setup with multiple ZED 2i cameras sending data to Unreal via Fusion Live Link.

A user would be able to “log in” to the application via an RFID tag and then get their personalized avatar in the 3D scene. To do that, though, I would have to be able to determine which of the people in the scene is who.

My idea (or hope) was that I could grab the distances between different joints (shoulder to shoulder, neck to pelvis, etc.) and save those along with the user’s avatar. Then I could load that data and match it against the people in the scene, so that each avatar ends up on the right person.
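Roughly what I had in mind, as a minimal sketch in plain C++ (not the actual ZED or Live Link types; I’m assuming the joint positions have already been pulled out of the skeleton data, and all the names here are made up):

```cpp
#include <cmath>
#include <limits>
#include <string>
#include <vector>

struct Vec3 { float X, Y, Z; };

static float Distance(const Vec3& A, const Vec3& B)
{
    const float dx = A.X - B.X, dy = A.Y - B.Y, dz = A.Z - B.Z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// The bone lengths I wanted to use as a "body signature".
struct BodyProfile
{
    std::string UserId;
    float ShoulderToShoulder = 0.f;
    float NeckToPelvis = 0.f;
};

// Build a profile from joint positions taken from the skeleton data
// (how you extract those positions depends on your setup).
BodyProfile MakeProfile(const std::string& UserId,
                        const Vec3& LeftShoulder, const Vec3& RightShoulder,
                        const Vec3& Neck, const Vec3& Pelvis)
{
    return { UserId,
             Distance(LeftShoulder, RightShoulder),
             Distance(Neck, Pelvis) };
}

// Compare a live measurement against the saved profiles; the smallest
// sum of absolute differences wins.
std::string FindClosestUser(const BodyProfile& Live,
                            const std::vector<BodyProfile>& Saved)
{
    std::string BestId;
    float BestError = std::numeric_limits<float>::max();
    for (const BodyProfile& Profile : Saved)
    {
        const float Error =
            std::fabs(Live.ShoulderToShoulder - Profile.ShoulderToShoulder) +
            std::fabs(Live.NeckToPelvis - Profile.NeckToPelvis);
        if (Error < BestError)
        {
            BestError = Error;
            BestId = Profile.UserId;
        }
    }
    return BestId;
}
```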

However, I found out that the distances vary depending on where in the setup a person enters the cameras’ view. Sometimes the shoulders are 35 cm apart; other times (for the same person, of course) they are 30 or 40 cm apart.

Are there ways to make the numbers more consistent? Or is anyone aware of other ways to determine which of the people in the scene is who? Face recognition would be optimal, I suppose, but I don’t have that :smiley:

Thanks!

Hi @mov,

The skeleton data does indeed vary; it can be affected by factors like clothing, the camera’s position, and the subject’s position. I don’t think its accuracy is reliable enough for automatic re-identification.
We don’t have an out-of-the-box solution, but I imagine you could implement an identification system and have it communicate with your UE5 app. For example, if you know who will enter the setup next, you can assign them the correct avatar.
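Something like this, as a rough sketch (plain C++, nothing ZED- or UE-specific, all names are placeholders): the login system queues whoever badged in, and the UE5 side pops the queue when a new skeleton shows up.

```cpp
#include <optional>
#include <queue>
#include <string>

class PendingLoginQueue
{
public:
    // Called by the login system when an RFID tag is scanned.
    void OnUserLoggedIn(const std::string& UserId)
    {
        Pending.push(UserId);
    }

    // Called on the UE5 side when a new person is detected: the new
    // skeleton gets the avatar of the user who logged in first.
    std::optional<std::string> AssignNextUser()
    {
        if (Pending.empty())
        {
            return std::nullopt;
        }
        std::string UserId = Pending.front();
        Pending.pop();
        return UserId;
    }

private:
    std::queue<std::string> Pending;
};
```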

Sorry this isn’t much help; your project does sound exciting!

Thank you for your reply, I feared as much :stuck_out_tongue:

I guess for the time being I’ll just have to try using the root position after a user logs in. Thanks!
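For reference, roughly how I’m planning to do that, as a sketch (plain C++, hypothetical names; the RFID reader’s world position would be configured by hand): when a login comes in, bind the avatar to the tracked person whose root joint is closest to the reader, within some tolerance.

```cpp
#include <cmath>
#include <optional>
#include <vector>

struct Vec3 { float X, Y, Z; };

struct TrackedPerson
{
    int SkeletonId;    // ID coming from the body tracking data
    Vec3 RootPosition; // pelvis/root joint in world space
};

// Return the skeleton closest to the RFID reader, if anyone is near it.
std::optional<int> FindPersonAtReader(const std::vector<TrackedPerson>& People,
                                      const Vec3& ReaderPosition,
                                      float MaxDistanceMeters = 1.0f)
{
    std::optional<int> Best;
    float BestDist = MaxDistanceMeters;
    for (const TrackedPerson& Person : People)
    {
        const float dx = Person.RootPosition.X - ReaderPosition.X;
        const float dy = Person.RootPosition.Y - ReaderPosition.Y;
        const float dz = Person.RootPosition.Z - ReaderPosition.Z;
        const float Dist = std::sqrt(dx * dx + dy * dy + dz * dz);
        if (Dist <= BestDist)
        {
            BestDist = Dist;
            Best = Person.SkeletonId;
        }
    }
    return Best;
}
```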