Can you make a GStreamer element with a virtual camera?

Two Questions:

  1. How do you properly set up the configuration files for streaming a virtual camera? Do you need both a .yml and a .conf, or just a .conf? And specifically, where does each of them go?

zed_calibration_115047324.yml (1.2 KB)

SN115047324.conf (755 Bytes)

How do I tell whether I am supposed to be using LEFT_CAMERA_2K or another mode (like HD)?

  2. Does zed-gstreamer support virtual stereo cameras?
    I don't see this clarified anywhere in the docs or in the code. I tried running `gst-launch-1.0 zedsrc ! autovideoconvert ! queue ! fpsdisplaysink` and got the output below. What setup do I need to make this work?
Setting pipeline to PAUSED ...
Setting depth_mode to NONE
ERROR: from element /GstPipeline:pipeline0/GstZedSrc:zedsrc0: Failed to open camera, 'CAMERA STREAM FAILED TO START'
Additional debug info:
/home/jetson/perception/zed-gstreamer/gst-zed-src/gstzedsrc.cpp(2343): gst_zedsrc_start (): /GstPipeline:pipeline0/GstZedSrc:zedsrc0
ERROR: pipeline doesn't want to preroll.
ERROR: from element /GstPipeline:pipeline0/GstZedSrc:zedsrc0: GStreamer error: state change failed and some element failed to post a proper error message with the reason for the failure.
Additional debug info:
../libs/gst/base/gstbasesrc.c(3556): gst_base_src_start (): /GstPipeline:pipeline0/GstZedSrc:zedsrc0:
Failed to start
ERROR: pipeline doesn't want to preroll.
Failed to set pipeline to PAUSED.
Setting pipeline to NULL ...
Freeing pipeline ...

Hi @adinesh
this is only supported by using the ZED Media Server and setting the input-stream-ip and input-stream-port zedsrc parameters.
The parameter opencv-calibration-file can be used to specify the calibration file.
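Putting those parameters together, a receive pipeline would look roughly like the sketch below. The property names (input-stream-ip, input-stream-port, opencv-calibration-file) are the ones named above; the IP, port, and calibration path are placeholders you'd replace with the values your ZED Media Server instance is using, and this only works while the Media Server is actually streaming:

```
# Sketch: receive a ZED Media Server stream with zedsrc
# (replace IP, port, and calibration path with your own values)
gst-launch-1.0 zedsrc \
    input-stream-ip=192.168.1.50 \
    input-stream-port=30000 \
    opencv-calibration-file=/path/to/zed_calibration_115047324.yml \
    ! autovideoconvert ! queue ! fpsdisplaysink
```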

Would it be possible for StereoLabs to develop that? Or could I start developing it if the code is available? Even simple CLI arguments added to the ZED_MEDIA_SERVER launch that set input-stream-ip, input-stream-port, and opencv-calibration-file would let you run the whole thing headlessly or over X11 forwarding (without a keyboard and mouse).

Moreover, is there any way to stream via a different protocol, such as SRT instead of RTP?
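One possible workaround, assuming zedsrc itself only ingests the Media Server protocol: re-wrap the decoded video in an SRT transport on the GStreamer side using srtsink (from gst-plugins-bad, 1.16+; whether it is present depends on your GStreamer build). A rough sketch, with the port and encoder settings as placeholders:

```
# Sketch: re-encode zedsrc output and serve it over SRT
# (srtsink availability and URI syntax depend on your GStreamer build)
gst-launch-1.0 zedsrc ! autovideoconvert ! \
    x264enc tune=zerolatency ! mpegtsmux ! \
    srtsink uri="srt://:8888?mode=listener"
```

This does not preserve the ZED SDK's own stream format or depth data; it is just the video frames re-encoded into a generic SRT/MPEG-TS stream.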

Also, is there a way to send a stream from a virtual camera and, on the receiver side, ingest that stream as a GStreamer pipeline? This covers the set of scenarios where the receiver (an iPad, for example) can't install the SDK. How does ZED streaming work under the hood? Is it producing a GStreamer stream?
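For the SDK-free receiver case, one assumption-laden sketch: if the sender re-encodes the camera feed as standard RTP/H.264 (e.g. x264enc ! rtph264pay ! udpsink, rather than the ZED SDK's native stream), then any GStreamer-capable receiver can ingest it with stock elements and no ZED software at all. The port and payload number below are placeholders:

```
# Sketch: ingest a plain RTP/H.264 stream on a receiver without the ZED SDK
# (assumes the sender re-encoded with rtph264pay; not the SDK's native format)
gst-launch-1.0 udpsrc port=5000 \
    caps="application/x-rtp,media=video,encoding-name=H264,payload=96" ! \
    rtph264depay ! avdec_h264 ! videoconvert ! autovideosink
```

Whether the ZED SDK's own streaming can be consumed this way depends on what its wire format actually is, which is the question above.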