My goal is to have one device stream the ZED capture and another receive it. Both devices will record the output into an SVO2 file to compare the outputs.
My question is: What’s the best way to compare the recording files?
So far, I’ve tried exporting it to L (RGB) + R (Depth), which is not ideal for my case, but I can manage. (Ideally, I would export an avi/mp4 of the same output as the depth-sensing view.) I then compare the two videos using VMAF. The issue is that both videos need to be exactly aligned on the timeline (starting and ending at the same millisecond), or the final score will be skewed.
I’ve already tried cutting them manually in video-editing software, but I’m fairly sure the exported versions contain defects.
The SVO format only saves raw sensor data, i.e. the images plus IMU and other sensor readings from the ZED camera; depth is recomputed by the SDK when the SVO is played back. For this reason, comparing depth information between two SVOs would not be very relevant.
What I would suggest is comparing the L+R images with the same timestamps with an image similarity score.
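As a minimal sketch of such a similarity score, here is a plain PSNR (peak signal-to-noise ratio) comparison in NumPy; the function name and the tiny synthetic images are just illustration, and you could equally use SSIM from scikit-image:

```python
import numpy as np

def psnr(img_a, img_b, max_val=255.0):
    """Peak signal-to-noise ratio between two same-sized images.

    Higher is more similar; identical images give infinity.
    """
    a = img_a.astype(np.float64)
    b = img_b.astype(np.float64)
    mse = np.mean((a - b) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(max_val ** 2 / mse)

# Tiny synthetic example: an identical copy gives infinite PSNR,
# a slightly perturbed copy gives a high but finite score.
ref = np.full((4, 4), 128, dtype=np.uint8)
noisy = ref.copy()
noisy[0, 0] = 130
print(psnr(ref, ref))    # inf
print(psnr(ref, noisy))  # high finite value (~54 dB here)
```

In practice you would retrieve the left and right images from each SVO with the SDK, convert them to NumPy arrays, and feed each matching pair to a function like this.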
These images won’t be exactly identical, because streaming them through the SDK uses NVENC, which is a lossy GPU encoder.
You can use our ZED_SVO_Editor tool with the -cut option. You can provide a frame number or a timestamp to trim the beginning or the end of the SVO as you wish.
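The exact argument order can vary between SDK versions (the syntax below is my best recollection, not authoritative; run the tool with -h to confirm), but a frame-based cut looks roughly like:

```shell
# Trim input.svo to frames 100..500 (assumed syntax; verify with ZED_SVO_Editor -h).
ZED_SVO_Editor -cut input.svo s 100 e 500 output.svo
```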
You can write a script that compares each image pair, and to sync the images from the two SVOs, you can use the getTimestamp(sl::TIME_REFERENCE::IMAGE) method. The timestamps in the SVO recorded on the host will have the same values as the timestamps on the receiver for the same point in time.
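To illustrate the syncing step, here is a small pure-Python sketch that pairs up frame indices from two SVOs by their image timestamps. It assumes you have already collected one sorted list of timestamps per SVO (in Python bindings that would be `zed.get_timestamp(sl.TIME_REFERENCE.IMAGE)` after each successful `grab()`); the function name and tolerance parameter are mine, not part of the SDK:

```python
def pair_frames_by_timestamp(ts_host, ts_receiver, tolerance_ns=1_000_000):
    """Pair frame indices from two SVOs by image timestamp.

    ts_host, ts_receiver: sorted lists of timestamps in nanoseconds,
    one entry per grabbed frame. Returns a list of (host_index,
    receiver_index) pairs whose timestamps differ by at most
    `tolerance_ns` (1 ms by default).
    """
    pairs = []
    j = 0
    for i, t in enumerate(ts_host):
        # Skip receiver frames that are too old to match this host frame.
        while j < len(ts_receiver) and ts_receiver[j] < t - tolerance_ns:
            j += 1
        if j < len(ts_receiver) and abs(ts_receiver[j] - t) <= tolerance_ns:
            pairs.append((i, j))
            j += 1  # each receiver frame is matched at most once
    return pairs

# Synthetic timestamps: the receiver is missing one frame and has jitter.
host = [100, 200, 300]
receiver = [100, 205, 400]
print(pair_frames_by_timestamp(host, receiver, tolerance_ns=10))
# → [(0, 0), (1, 1)]
```

Once the pairs are established, you retrieve the left and right images at each matched index from both SVOs and run your image-similarity score on them.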