Problem with nodal offset for Virtual Production with the lens calibration plugin in UE

Hi there.
I've begun my long journey into Virtual Production with a ZED 2i.
I got Live Link working perfectly with UE 4.27 and 5. The camera provides really strong tracking without any markers or the like in my studio, so for people wondering whether they should buy it for VP: for me it's a yes.
Now I've run into a little problem.
When I followed the tutorials from Epic and others on the net for the lens calibration plugin, everything worked until the very end… except the last step.
I got my distortion and so on working, with the trick for FIZ data etc., but my issue is the NODAL OFFSET.
If I understood correctly, the plugin built into UE computes it more or less automatically with different methods (points, ArUco markers and so on).
My issue is that these solutions won't work for me; when I asked in a Facebook group, I was told I would need at least two trackers.

I found this: the guy uses a T265 and somehow (I'm not a programmer at all) the T265 recognizes the ArUco marker, which lets him end up with his nodal offset calibrated and lets him use those markers to set up the floor for VP, as shown in this video: https://youtu.be/DQT0Qy856mA?t=251
As I mentioned, I'm still learning all this complex technical stuff, and I'm wondering whether I can use the same solution as the guy with the T265 (compatibility?), or whether you can explain why I can or can't use ArUco markers.
Same topic, but getting ArUco markers into the VP workflow is really important, as it cuts the time to set up the actor in the scene using Live Link by roughly 10x.
This is the last step that remains very mysterious to me, even after grinding 10 hours a week to get everything working pretty well in the studio for both live and previz-only work.

Hi,

Thanks for reaching out to us.

For the moment we do not provide a solution to compute the transform between our camera and an external camera.
We are discussing internally how we can provide the best tools for Virtual Production applications like yours, so do not hesitate to give us your feedback; it will help us go in the right direction.

I’ve not tested this plugin yet but according to the documentation, the plugin only requires the position/orientation of the camera and the camera intrinsic parameters.
We are currently only sharing the camera aspect ratio with the livelink sample (here : https://github.com/stereolabs/zed-livelink-plugin/blob/main/Source/Private/main.cpp#L336).
The plugin might requires more parameters than that.
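For reference, the full left-camera intrinsics are exposed by the SDK, so they could be forwarded the same way. A minimal sketch of where they live (assuming the ZED SDK 3.x C++ API):

```cpp
#include <sl/Camera.hpp>
#include <iostream>

int main() {
    sl::Camera zed;
    sl::InitParameters init_params;
    if (zed.open(init_params) != sl::ERROR_CODE::SUCCESS) return 1;

    // Intrinsics of the left sensor (the one the tracking is referenced to).
    auto left = zed.getCameraInformation()
                   .camera_configuration.calibration_parameters.left_cam;
    std::cout << "fx=" << left.fx << " fy=" << left.fy
              << " cx=" << left.cx << " cy=" << left.cy << std::endl;
    // left.disto holds the distortion coefficients (k1, k2, p1, p2, k3).

    zed.close();
    return 0;
}
```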

What exactly happens when you say this solution does not work for you? Do you get a wrong result, or no result at all?

One solution could be to compute this transform outside of Unreal, using the same ArUco detection method (with OpenCV); see the sketch below.
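To give an idea of what that would involve, here is a rough sketch using the opencv_contrib aruco module; the intrinsics, distortion, and marker size below are placeholders that must come from your lens calibration:

```cpp
#include <opencv2/opencv.hpp>
#include <opencv2/aruco.hpp>
#include <vector>

int main() {
    cv::VideoCapture cap(0); // any image source, e.g. the ZED left view
    // Placeholder intrinsics and distortion; use your calibrated values.
    cv::Mat K = (cv::Mat_<double>(3, 3) << 700, 0, 640, 0, 700, 360, 0, 0, 1);
    cv::Mat dist = cv::Mat::zeros(1, 5, CV_64F);
    cv::Ptr<cv::aruco::Dictionary> dict =
        cv::aruco::getPredefinedDictionary(cv::aruco::DICT_4X4_50);

    cv::Mat frame;
    while (cap.read(frame)) {
        std::vector<int> ids;
        std::vector<std::vector<cv::Point2f>> corners;
        cv::aruco::detectMarkers(frame, dict, corners, ids);
        if (!ids.empty()) {
            std::vector<cv::Vec3d> rvecs, tvecs;
            cv::aruco::estimatePoseSingleMarkers(
                corners, 0.15f /* marker side in meters */, K, dist, rvecs, tvecs);
            // rvecs/tvecs give each marker's pose in the camera frame;
            // inverting that pose gives the camera's pose relative to the
            // marker (e.g. a marker lying on the floor).
        }
    }
    return 0;
}
```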

Best,
Benjamin V.

Stereolabs Support

Hello!
I'll try to explain (basically) what is missing, with my limited knowledge.
First is the frame rate.
You need to let people change the frame rate of the ZED 2i, if possible, so we can match the frame rate of our film camera.
Let's say I want to shoot at 25 fps: I'm basically f…
In my case, the FX6 can't do 30 fps, only 29.97 for example, and I can't tweak the ZED 2i (I'm not a dev) to output 29.97 or 60 fps into Live Link.
I don't know whether that's even possible, but it's an issue.
The second problem is about detecting ArUco markers.
There is a way of doing it with the T265, as shown here: https://www.youtube.com/watch?v=xMMGAq3e96U&t=82s
In my case I tried, but it messed up the nodal offset result, so I used another technique which is not really precise.
That's a shame, because I had really good tracking results and I think it could work flawlessly.
I can try doing it with OpenCV, but if you want your product to be considered a reference, you'll need it to be plug and play.
In the workflow inside UE, ArUco id_08 is used for setting up the floor, and that doesn't work for me. So I have to do it by hand, but it's tedious, and the trick I'm using isn't reliable enough (I parent a "null object" to the camera and offset it like in my real rig, then I add the Live Link to the "null object"). It's messy.
So what I'm suggesting, rather than Blueprints and the like, is simply a piece of software dedicated to VP.
For example, Vanishing Vector is a company that provides a T265 + ZED camera bundle, and their software is fairly easy to use. They don't sell it outside their bundle, so… it can't help me.
Same for the T265: REtracker provides a solution that gives people really good tracking without needing to be technical about it.
I think that's what you're missing: an approachable solution for people who want to do VP.

On Tue, May 17, 2022 at 3:07 PM, Benjamin Vallon (Stereolabs) support@stereolabs.com wrote:

Hi,

First, we want to thank you for your feedback.
It will really help us a lot to develop the best Virtual Production tools possible.

We are aware that our current solution is far from complete. We are basically only providing a way to send the camera's position and orientation to UE, and that's all.

It's clearly something we want to develop in the near future, so that we can provide a full workflow for Virtual Production applications.

Best,
Benjamin Vallon

Stereolabs Support

Hi.
Thanks for your answer!
As far as I understand the workflow, if you could just enable, in your SDK or elsewhere, the tracker recognizing ArUco markers inside UE, it would unblock me and some other people I've seen looking for the same thing on various forums/Discords.
You can see in the link I provided (T265 with Unreal and a Kalman filter) how it's done; it should work the same in our case, just with a ZED 2i of course. They use some Blueprints too.
As I mentioned before, to my knowledge there aren't many workflows inside UE for calibrating the offset of the tracker and the floor (very important).
Whatever you decide to do, it should be as simple as allowing ArUco markers to be recognized, because most of us will use them to set up the floor (basically saying: hey tracker, this is the floor).
When you do that, the virtual camera automatically takes on the right position/orientation/Z-axis to match your real camera feed.
I'm honestly in love with this camera; I didn't add any tracking aids and it works incredibly well, with no jitter (or really very little, and only sometimes).
Regards
Pablo Jouglens.

NB: To get a rock-solid track, would I need to add reflective tape on the ground or something similar? I'm in a room with a "cubic" shape, all green everywhere I look.
NB 2: Would it be possible to recover the tracking data that you send to Unreal as a separate file?

On Wed, May 18, 2022 at 2:45 PM, Benjamin Vallon (Stereolabs) support@stereolabs.com wrote:

Hi,

There is no such thing as "unlocking" ArUco detection in the SDK. From what I saw in the video, only the camera position/orientation and some intrinsic parameters are sent to UE via Live Link from the T265; the image is not even sent.
As we are sending the same type of data, our camera should be compatible (unless I’m missing something).

The best way to improve visual tracking is to avoid uniform walls/colors (they give the tracker no points to match). Adding textures and objects can help a lot.
We do not have an option to automatically save the tracking data to a file, but that's a really interesting feature we should discuss.
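In the meantime, a user-side recorder is easy to put together against the SDK. A minimal sketch (assuming the 3.x C++ API; the CSV fields are chosen arbitrarily):

```cpp
#include <sl/Camera.hpp>
#include <fstream>

int main() {
    sl::Camera zed;
    sl::InitParameters init_params;
    if (zed.open(init_params) != sl::ERROR_CODE::SUCCESS) return 1;
    zed.enablePositionalTracking();

    std::ofstream log("tracking_log.csv");
    log << "timestamp_ns,tx,ty,tz,qx,qy,qz,qw\n";
    sl::Pose pose;
    while (zed.grab() == sl::ERROR_CODE::SUCCESS) {
        if (zed.getPosition(pose, sl::REFERENCE_FRAME::WORLD) ==
            sl::POSITIONAL_TRACKING_STATE::OK) {
            auto t = pose.getTranslation();   // position in world frame
            auto q = pose.getOrientation();   // orientation as a quaternion
            log << pose.timestamp.getNanoseconds() << ','
                << t.x << ',' << t.y << ',' << t.z << ','
                << q.x << ',' << q.y << ',' << q.z << ',' << q.w << '\n';
        }
    }
    return 0;
}
```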

Best,
Benjamin Vallon

Stereolabs Support

Hello,
I'm on the same quest, looking for a way to navigate a scene with ArUco detection.
It would be amazing if the SDK could allow something like the T265 does.

About Viensvite's message on frame rate, my two cents: it's not a good idea until you are able to genlock the tracker to the camera; otherwise, slowing down the frame rate will cause more trouble than it solves.

A feature that could be interesting too is reversing the camera.
Mounting the ZED 2i on a camera rig is not always easy; it would be great if we could mount it upside down and invert the data. Why invert? Since the mounting point is only at the bottom, mounting it head-down needs fewer rigging parts.

thanks

Hi,

Thanks for your feedback.

You should be able to mount the camera upside down without tracking issues. Moreover, you have access to an init parameter called "flip_mode" which allows you to flip the camera images and data if necessary (Video Module | API Reference | Stereolabs). By default it is set to "Auto", which means the SDK uses the IMU data to detect whether or not the camera is flipped.
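For example (a minimal sketch with the 3.x C++ API):

```cpp
#include <sl/Camera.hpp>

int main() {
    sl::Camera zed;
    sl::InitParameters init_params;
    // AUTO (the default) uses the IMU to detect an upside-down mount;
    // ON forces the flip, OFF disables it.
    init_params.camera_image_flip = sl::FLIP_MODE::AUTO;
    if (zed.open(init_params) != sl::ERROR_CODE::SUCCESS) return 1;
    zed.close();
    return 0;
}
```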

Best,
Benjamin Vallon

Hi, I'm on the journey with the ZED 2 and UE5 again too, so here is what is missing, although not all of it is ZED 2 related…

nodal offset:
This is the difference (or offset) in location and rotation between the ZED 2 mounted on the actual film camera and the lens entrance pupil (it changes with every lens… therefore, calculating the offset to the film camera's sensor center might be a good start).
However, there are tools (I think using OpenCV, for example) that take an image from both the film camera and the ZED 2 (in the ZED 2's case, I think the left camera) and use both images (after applying the correct lens distortion) to calculate the offset between the two; see the sketch below.
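Something along these lines (a sketch only; the rvec/tvec values are placeholders that would come from ArUco pose estimation on each undistorted camera image):

```cpp
#include <opencv2/opencv.hpp>
#include <iostream>

// Build a 4x4 camera-from-marker transform from an rvec/tvec pair.
static cv::Matx44d toMat(const cv::Vec3d& rvec, const cv::Vec3d& tvec) {
    cv::Mat R;
    cv::Rodrigues(rvec, R);
    cv::Matx44d T = cv::Matx44d::eye();
    for (int r = 0; r < 3; ++r) {
        for (int c = 0; c < 3; ++c) T(r, c) = R.at<double>(r, c);
        T(r, 3) = tvec[r];
    }
    return T;
}

int main() {
    // Poses of the same marker as seen by each camera, e.g. from
    // cv::aruco::estimatePoseSingleMarkers (placeholders here).
    cv::Vec3d rvec_film, tvec_film, rvec_zed, tvec_zed;

    cv::Matx44d film_from_marker = toMat(rvec_film, tvec_film);
    cv::Matx44d zed_from_marker  = toMat(rvec_zed,  tvec_zed);

    // film_from_zed = film_from_marker * marker_from_zed: the fixed offset
    // to apply to the ZED pose to obtain the film camera pose.
    cv::Matx44d film_from_zed = film_from_marker * zed_from_marker.inv();
    std::cout << cv::Mat(film_from_zed) << std::endl;
    return 0;
}
```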

aruco markers:
There is a tutorial on how to place the ZED 2 in any environment using ArUco markers, correct?
So one can use an ArUco marker placed somewhere in the world as a (0,0,0) reference point, as in the small example below.
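As a small worked example of that (placeholder pose values): with the marker taken as the world origin, the camera's world position is -Rᵀ·t, where (R, t) is the marker's pose in the camera frame from the ArUco detection.

```cpp
#include <opencv2/opencv.hpp>
#include <iostream>

int main() {
    // Placeholder marker pose in the camera frame (rvec/tvec from detection).
    cv::Vec3d rvec(0.1, 0.0, 0.0), tvec(0.0, 0.0, 2.0);
    cv::Mat R;
    cv::Rodrigues(rvec, R);
    cv::Mat t = (cv::Mat_<double>(3, 1) << tvec[0], tvec[1], tvec[2]);
    cv::Mat cam_pos = -R.t() * t; // camera position in marker/world space
    std::cout << "camera position in world: " << cam_pos.t() << std::endl;
    return 0;
}
```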

frame rate:
The ZED 2 is still fusing IMU data with camera data (sorry for the lack of a better description), right?
From what I know of other sensors, both data streams (even if they run at different frame rates / Hz) are fused into one stream.
Here we would need the ability to define at what "frame rate" the stream is sent via Live Link to Unreal.
To sync this perfectly, the sender system (the PC running the ZED 2 app) would need to be genlocked (via an SDI capture card, for example) to the overall project frame rate that the film camera, all Unreal Engine computers, and the rest of the video equipment are synced to.
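Without genlock, the closest software-only approximation I can think of is resampling the pose stream onto the project rate by timestamp. A rough illustrative sketch (not ZED-specific; positions only, rotations would need slerp):

```cpp
#include <vector>
#include <cstdint>
#include <cstddef>

struct Sample { int64_t t_ns; double x, y, z; };

// Linear interpolation between two samples at time t_ns.
static Sample lerp(const Sample& a, const Sample& b, int64_t t_ns) {
    double u = double(t_ns - a.t_ns) / double(b.t_ns - a.t_ns);
    return { t_ns, a.x + u * (b.x - a.x),
                   a.y + u * (b.y - a.y),
                   a.z + u * (b.z - a.z) };
}

// Resample a pose stream (at least two samples, strictly increasing
// timestamps) onto a fixed rate, e.g. 29.97 fps for the FX6 case above.
static std::vector<Sample> resample(const std::vector<Sample>& in, double fps) {
    std::vector<Sample> out;
    const int64_t step_ns = static_cast<int64_t>(1e9 / fps);
    std::size_t i = 0;
    for (int64_t t = in.front().t_ns; t <= in.back().t_ns; t += step_ns) {
        while (i + 1 < in.size() && in[i + 1].t_ns < t) ++i;
        out.push_back(lerp(in[i], in[i + 1], t));
    }
    return out;
}
```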

I would love to have a more in-depth, maybe even personal, conversation with someone at Stereolabs, if you want to know more about what is actually needed for a virtual production workflow.

Best,
Roman

Hi, I want to do the same as you: quickly identify the floor through ArUco and do the lens offset. May I ask, did you succeed? Can you share your work on ZED 2i + ArUco and then UE5?

Me too. I found the ZED Live Link plugin and an exe that can send the ZED position via Live Link; how can I make an exe that sends the ArUco marker position via Live Link?