I’m using the local video streaming module (Local Video Streaming - Stereolabs) to produce a set of h264 (RTP) streams on machine A using an Orin NX and two ZED X Minis.
On machine B, if I use the same module from the API, I can successfully receive the stream (e.g. view it).
However, this requires NVIDIA hardware acceleration on machine B.
Instead, I’m trying to run a GStreamer pipeline on machine B to receive the h264 streams without hardware acceleration.
For example:
gst-launch-1.0 -v udpsrc port=30000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264" ! rtph264depay ! avdec_h264 ! videoconvert ! video/x-raw,format=RGB ! autovideosink
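As a first sanity check (a sketch, assuming the same port 30000 as above), it may help to confirm that UDP packets are arriving at all, independent of any depayloading or decoding:

```shell
# Diagnostic only: no RTP parsing or decoding involved.
# If packets are reaching this port, fakesink prints a hex dump of each one;
# if nothing prints, the problem is upstream (sender address, port, firewall),
# not the caps or the decoder.
gst-launch-1.0 -v udpsrc port=30000 ! fakesink dump=true
```

If this shows traffic but the full pipeline still fails, the issue is likely in the caps or the depay/decode stages rather than the network path.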
Assuming this can work, my hope is to use the GStreamer pipeline in gscam to publish image topics from machine B.
However, all of my GStreamer pipelines have so far failed to capture the RTP stream.
What am I missing?
Are the caps in the GStreamer pipeline correct, or are there other details I need to watch out for?
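For reference, one variant I would expect to try is the pipeline above with an explicit RTP payload type and an `h264parse` element before the decoder (the `payload=(int)96` value is an assumption about the sender and would need to be checked against the actual stream; smart/curly quotes from copy-pasting into a shell are also a common silent failure):

```shell
# Sketch, not a verified fix: payload=96 is an assumption about the sender's
# RTP payload type; h264parse normalizes the H.264 stream for avdec_h264.
gst-launch-1.0 -v udpsrc port=30000 \
    caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" \
    ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink
```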
I’m also not sure how separating the left/right image streams of a stereo pair will work with this approach.
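In case it helps frame the question: assuming each stream ends up on its own UDP port (the port numbers and camera names below are placeholders, not something confirmed by the sender side), my rough plan was one gscam instance per stream, following the `GSCAM_CONFIG` convention from the gscam README:

```shell
# Hypothetical sketch: one gscam node per RTP stream, one port per stream.
# Ports, payload type, and node name are assumptions to be checked against
# the actual Stereolabs streaming configuration on machine A.
export GSCAM_CONFIG="udpsrc port=30000 caps=\"application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264\" ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert"
rosrun gscam gscam __name:=left_cam
```

Whether the ZED local streaming module actually exposes left and right as separate ports, or interleaves them in one stream, is exactly the part I’m unsure about.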
Thank you in advance.