When I launch three ZED X cameras on an Orin with the NEURAL_PLUS option, the RGB output drops to 1 Hz. Is it possible to use the PC SDK to generate the point clouds? For example, the Orin streams the video to a PC, and the PC produces the depth for higher-frequency output.
Hi Myzhar, the model I use is the AGX Jetson Orin Developer Kit (64 GB) with JetPack 5.1.1. After enabling jetson_clocks, the RGB output is still 1 Hz. For our application we need synced RGB images from all three cameras, and we have tried many other depth estimation methods; your NEURAL_PLUS model performs the best of all. I know you have streaming functions for the cameras, but I am not sure whether I can get synced images through the streaming module. We need synced RGBs; it is okay to post-process the depth.
Hi @Jiahe
the NEURAL_PLUS depth mode is highly demanding in terms of required computing power.
If you use it simultaneously with 3 ZED X cameras, a final rate of 1 Hz is expected.
Concerning synchronization, the ZED Link Quad capture card handles the requirements to keep the frames synchronized during capture; local streaming will not affect this behavior.
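For reference, here is a rough sketch of the Orin-side sender with the Python API. It assumes the pyzed bindings are installed; the `stream_port` helper and the port layout (one even port per camera, spaced by 2, since each stream reserves its port and the next one) are my own convention, not an SDK requirement:

```python
# Sketch: enable local streaming for three ZED X cameras on the Orin,
# with depth disabled on the Orin side (depth will run on the PC).
# Guarded import so the port helper below still works without the SDK.
try:
    import pyzed.sl as sl
    HAVE_SDK = True
except ImportError:
    HAVE_SDK = False


def stream_port(cam_index, base_port=30000):
    """Hypothetical port layout: give each camera its own even port,
    spaced by 2, so the streams do not collide."""
    return base_port + 2 * cam_index


if __name__ == "__main__" and HAVE_SDK:
    cameras = []
    for i in range(3):
        init = sl.InitParameters()
        init.set_from_camera_id(i)            # open the i-th attached camera
        init.depth_mode = sl.DEPTH_MODE.NONE  # no depth computation on the Orin
        cam = sl.Camera()
        if cam.open(init) != sl.ERROR_CODE.SUCCESS:
            raise RuntimeError(f"camera {i} failed to open")
        stream_params = sl.StreamingParameters()
        stream_params.port = stream_port(i)
        cam.enable_streaming(stream_params)
        cameras.append(cam)
    # Keep grabbing so the encoder feeds the streams.
    while True:
        for cam in cameras:
            cam.grab()
```

Disabling depth on the sender keeps the Orin load to capture and encoding only, which is the point of offloading NEURAL_PLUS to the PC.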
Hi Myzhar, thanks for the quick response. So the best solution is to stream the three ZED cameras from the Orin and, on the PC end, use three scripts to run the NEURAL_PLUS module? We hope to implement this in ROS, so we should start three nodes for the three cameras and publish the depth images for higher-frequency output?
Yes, this can be a working solution.
Please note that the receiving PC should be equipped with an NVIDIA GPU of the latest generation (an RTX model) to obtain maximum performance with NEURAL_PLUS.
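A rough sketch of what each PC-side receiver (one instance or ROS node per camera) could look like with the Python API. The sender IP address, the `stream_endpoints` helper, and the port layout are placeholders for your own setup, not SDK requirements:

```python
# Sketch: PC-side receiver that connects to a stream sent by the Orin
# and computes NEURAL_PLUS depth locally on the PC's GPU.
# Guarded import so the endpoint helper works without the SDK installed.
try:
    import pyzed.sl as sl
    HAVE_SDK = True
except ImportError:
    HAVE_SDK = False


def stream_endpoints(sender_ip, n_cameras, base_port=30000):
    """Hypothetical layout: one even port per camera stream."""
    return [(sender_ip, base_port + 2 * i) for i in range(n_cameras)]


def run_receiver(ip, port):
    init = sl.InitParameters()
    init.set_from_stream(ip, port)               # connect to the Orin's stream
    init.depth_mode = sl.DEPTH_MODE.NEURAL_PLUS  # heavy depth runs on the PC GPU
    cam = sl.Camera()
    if cam.open(init) != sl.ERROR_CODE.SUCCESS:
        raise RuntimeError(f"could not connect to {ip}:{port}")
    depth = sl.Mat()
    while cam.grab() == sl.ERROR_CODE.SUCCESS:
        cam.retrieve_measure(depth, sl.MEASURE.DEPTH)
        # ...publish the depth image on a ROS topic here...


if __name__ == "__main__" and HAVE_SDK:
    # "192.168.1.10" is a placeholder; run one process per endpoint.
    ip, port = stream_endpoints("192.168.1.10", 3)[0]
    run_receiver(ip, port)
```

Since the frames are synchronized at capture by the ZED Link Quad card, the three receivers can match frames across cameras by timestamp after streaming.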