Question on Running ZED with ROS and DeepStream SDK

Hi, I am looking to perform CV using the DeepStream SDK, but I also need to run a ROS script to obtain depth information. However, when I run roslaunch zed_wrapper zed.launch, I am unable to access /dev/video0 from the DeepStream SDK, which reports that the device is busy. Similarly, starting DeepStream before ROS causes a failure on the ROS side ("Camera Stream Failed to Setup"). The two modules work perfectly fine individually.

I have attempted to resolve this by streaming the camera feed out of ROS over RTSP, as in this repo: https://github.com/CircusMonkey/ros_rtsp, but the resulting video feeds were either low in resolution or had high latency.

Could I get some assistance on whether it is possible to consume data from the ZED in both ROS and DeepStream?

As an add-on, I am also currently trying to resolve this by duplicating the video device (splitting /dev/video0 into /dev/video2 and /dev/video3), a method I found here: https://unix.stackexchange.com/questions/343832/how-to-read-a-webcam-that-is-already-used-by-a-background-capture. I have so far been able to run DeepStream from a duplicated device, but I cannot run ROS because I am unable to change its video input device from /dev/video0 (which is being consumed by the v4l2loopback feed) to /dev/video3.

Hi @1simjustin
The ZED camera cannot be opened by different processes running at the same time.
A workaround is to use the Streaming module to open the camera in one process and stream its data out to the other processes.
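For reference, a minimal sender sketch using the ZED SDK's Python API (pyzed, SDK 3.x) could look like the following. The port and codec values are illustrative choices, not requirements:

```python
# Sender: opens the ZED and publishes its feed via the SDK's Streaming module.
# Sketch based on the ZED SDK 3.x Python API (pyzed); port/codec are illustrative.
import pyzed.sl as sl

cam = sl.Camera()
init = sl.InitParameters()
init.camera_resolution = sl.RESOLUTION.HD720

if cam.open(init) != sl.ERROR_CODE.SUCCESS:
    raise RuntimeError("Could not open the ZED camera")

stream = sl.StreamingParameters()
stream.codec = sl.STREAMING_CODEC.H264   # hardware-encoded stream
stream.port = 30000                      # receivers connect to <sender-ip>:30000
if cam.enable_streaming(stream) != sl.ERROR_CODE.SUCCESS:
    raise RuntimeError("Could not start streaming")

try:
    while True:
        cam.grab()   # each successful grab pushes a frame to connected receivers
finally:
    cam.disable_streaming()
    cam.close()
```

If the ROS wrapper must be the process that owns the camera, it is worth checking whether your zed-ros-wrapper version exposes a service to enable this same Streaming module from the running ROS node instead.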

Hi @Myzhar
Thanks for your quick reply! I am relatively new to the ZED camera. Could you provide some tips on setting up the streaming module so that, on the receiver side, I can obtain an image stream to feed into the DeepStream pipeline?
I assume I should be running GStreamer on the received stream, but I have been unable to configure GStreamer to consume it, which results in "Camera Stream Failed To Start" errors.

Here you can find the Streaming module documentation:
https://www.stereolabs.com/docs/video/streaming/

Hi @Myzhar
From my understanding of the documentation, the streaming module streams camera data to another device, usually for external processing. However, I am looking for a way to output the camera stream so that it can be consumed by DeepStream (through RTSP) while at the same time obtaining depth metadata. With the camera streaming module, I cannot find an address, port, or communication protocol to connect to.

You can use the streaming as a "pipe" to retrieve ZED data in a second process that runs in parallel with the process that opened the camera and took control of it.
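Concretely, a second process subscribes to that pipe by pointing its InitParameters at the sender instead of a physical camera; it can then retrieve images and depth as if it had opened the camera locally. A hedged sketch (pyzed, SDK 3.x; the IP and port are placeholders for wherever the sender is running):

```python
# Receiver: connects to a running ZED stream and retrieves both the image
# and the depth map, as if a local camera had been opened.
# IP address and port below are placeholders.
import pyzed.sl as sl

init = sl.InitParameters()
init.set_from_stream("192.168.1.10", 30000)  # sender's address and stream port
init.depth_mode = sl.DEPTH_MODE.PERFORMANCE  # depth is computed on the receiver

cam = sl.Camera()
if cam.open(init) != sl.ERROR_CODE.SUCCESS:
    raise RuntimeError("Could not connect to the ZED stream")

image = sl.Mat()
depth = sl.Mat()
while cam.grab() == sl.ERROR_CODE.SUCCESS:
    cam.retrieve_image(image, sl.VIEW.LEFT)        # left BGRA frame
    cam.retrieve_measure(depth, sl.MEASURE.DEPTH)  # per-pixel depth map
    frame = image.get_data()  # numpy array, ready for further processing
```

From here, one option is to push the retrieved frames into the DeepStream pipeline through a GStreamer appsrc element, so DeepStream never needs to open /dev/video0 itself.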