Hey all,
I want to use up to four ZED2 cameras in my project. I understand that there are two approaches to get the data (depth, color and point clouds) from the cameras: either plug them all into one host PC, or connect each sensor to its own box (e.g. a Jetson) and link those boxes via a network hub/switch to another PC, where the data is aggregated.
My questions are:
a) Is it even possible, and if so, what are the hardware requirements to operate four ZED2 devices on a single host PC? I know I need a separate USB controller for each device, but how much (V)RAM and what CPU/GPU resources are needed to pull data from all four sensors at once? (See my first sketch below for roughly how I would open them.)
b) Using the second (network-based) approach, what software needs to run on the individual “satellite” boxes? Is there a tool that simply forwards the sensor interface and data over the network to the controlling PC, or do I have to build something with the SDK myself? (See my second sketch below.)
c) Do both the “satellite” boxes and the controlling PC need NVIDIA graphics and CUDA support?
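For context on a), this is roughly how I would try to open all four cameras in one process, assuming the SDK lets me select each device by serial number. Untested sketch, the serial numbers are placeholders:

```python
import pyzed.sl as sl

# Placeholder serial numbers - replace with the actual ones reported by
# sl.Camera.get_device_list() on the host.
SERIALS = [11111111, 22222222, 33333333, 44444444]

cameras = []
for serial in SERIALS:
    init = sl.InitParameters()
    init.camera_resolution = sl.RESOLUTION.VGA   # low res to stay inside the USB bandwidth budget
    init.camera_fps = 30
    init.depth_mode = sl.DEPTH_MODE.PERFORMANCE
    init.set_from_serial_number(serial)          # bind this Camera object to one physical device

    cam = sl.Camera()
    status = cam.open(init)
    if status != sl.ERROR_CODE.SUCCESS:
        print(f"Failed to open camera {serial}: {status}")
        continue
    cameras.append(cam)

# Grab one frame (image + depth) from each camera in turn.
image, depth = sl.Mat(), sl.Mat()
for cam in cameras:
    if cam.grab() == sl.ERROR_CODE.SUCCESS:
        cam.retrieve_image(image, sl.VIEW.LEFT)
        cam.retrieve_measure(depth, sl.MEASURE.DEPTH)
```

In practice I would probably give each camera its own grab thread or process rather than polling them in one loop, but the question is whether one PC can feed four of these at all.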
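For b), my understanding from the docs is that the SDK has a built-in streaming module, so each Jetson could run a small sender and the controlling PC would open the stream as if it were a local camera. Again a rough sketch based on my reading, not something I have running; the IP and port are made up:

```python
import pyzed.sl as sl

# --- On each Jetson ("satellite") box: open the local camera and stream it ---
def run_sender(port=30000):
    init = sl.InitParameters()
    init.camera_resolution = sl.RESOLUTION.HD1080
    init.camera_fps = 30

    cam = sl.Camera()
    if cam.open(init) != sl.ERROR_CODE.SUCCESS:
        raise RuntimeError("could not open local ZED2")

    stream = sl.StreamingParameters()
    stream.codec = sl.STREAMING_CODEC.H264   # hardware-encoded on the Jetson
    stream.port = port
    cam.enable_streaming(stream)

    while True:                              # each grab() pushes a frame to the stream
        cam.grab()

# --- On the controlling PC: open the remote stream like a local camera ---
def run_receiver(sender_ip="192.168.1.10", port=30000):
    init = sl.InitParameters()
    init.set_from_stream(sender_ip, port)    # input is the network stream, not USB

    cam = sl.Camera()
    if cam.open(init) != sl.ERROR_CODE.SUCCESS:
        raise RuntimeError(f"could not connect to {sender_ip}:{port}")

    depth = sl.Mat()
    while cam.grab() == sl.ERROR_CODE.SUCCESS:
        # presumably the depth map is computed here, on the receiver's GPU
        cam.retrieve_measure(depth, sl.MEASURE.DEPTH)
```

If that is indeed how it works, it would also partly answer c), since the receiving side would then be the one doing the CUDA work per stream.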
In testing on my own machines I find that I can run three cameras at 2K resolution plus one more at VGA; if all are set to VGA resolution I can run five concurrently. Measured USB bandwidth per camera is roughly 200 Mbit/s at 2K/15 fps or roughly 250 Mbit/s at 1080p/30 fps, which is only a fraction of the theoretical USB 3.0 bandwidth, but in my experience the SL API throws a USB bandwidth error once the aggregate approaches 1 Gbit/s, and four cameras at 1080p/30 fps (4 × 250 Mbit/s) are already right at that limit.