Using compositor in Gstreamer to merge imagesequence with video/audio stream


I am working on a streaming application that streams video and audio from a Raspberry Pi. I want to overlay some sensor data on the video, and after many failed attempts with textoverlay I am now trying imagesequencesrc instead: my sensor script writes an image containing the sensor data at regular intervals, and imagesequencesrc should display that image on top of the video.

I have gotten this to work on the Raspberry Pi's desktop with this pipeline:

gst-launch-1.0 imagesequencesrc location=/home/pi/amazon-kinesis-video-streams-webrtc-sdk-c/build/samples/image-%05d.png start-index=1 stop-index=-1 framerate=1/1 ! decodebin ! compositor name=comp ! autovideosink videotestsrc ! video/x-raw, framerate=\(fraction\)5/1, width=320, height=240 ! comp.

But in my application I am using the gst_parse_launch() function as described in the GStreamer documentation, and I just can't get it to work.
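For context, this is the pattern I follow around the gst_parse_launch() call (a minimal sketch, not my exact application code; the test pipeline string here is just a placeholder):

```c
#include <gst/gst.h>

int main(int argc, char *argv[]) {
    gst_init(&argc, &argv);

    /* gst_parse_launch() can return a (partially constructed) pipeline
     * AND set a GError at the same time, so both must be checked. */
    GError *error = NULL;
    GstElement *pipeline =
        gst_parse_launch("videotestsrc num-buffers=10 ! autovideosink", &error);

    if (error != NULL) {
        g_printerr("Parse error: %s\n", error->message);
        g_clear_error(&error);
    }
    if (pipeline == NULL)
        return -1;

    gst_element_set_state(pipeline, GST_STATE_PLAYING);
    /* ... run a main loop / wait for EOS, then clean up ... */
    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);
    return 0;
}
```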

I have tried several pipelines, to no avail:

pipeline = gst_parse_launch("imagesequencesrc location=/home/pi/amazon-kinesis-video-streams-webrtc-sdk-c/build/samples/image-%05d.png start-index=1 stop-index=-1 framerate=1/1 ! decodebin ! videoconvert ! compositor name=comp !" 
                            "appsink sync=TRUE emit-signals=TRUE name appsink-imageseq autovideosrc ! clockoverlay ! queue ! videoconvert ! video/x-raw,width=1280,height=720,framerate=25/1 !"
                            "x264enc bframes=0 speed-preset=veryfast bitrate=2048 byte-stream=TRUE tune=zerolatency !"
                            "video/x-h264,stream-format=byte-stream,alignment=au,profile=baseline ! appsink sync=TRUE emit-signals=TRUE "
                            "name=appsink-video autoaudiosrc ! "
                            "queue leaky=2 max-size-buffers=400 ! audioconvert ! audioresample ! opusenc ! "
                            "audio/x-opus,rate=48000,channels=2 ! appsink sync=TRUE emit-signals=TRUE name=appsink-audio ! comp. !" ,
                            &error);

This results in:

GStreamer-CRITICAL **: 10:57:36.916: gst_element_link_pads_filtered: assertion 'GST_IS_BIN (parent)' failed

Moving the comp. link before the x264 encoder, so that compositor merges the image sequence and the video before encoding, lets the application run without errors, but no video or audio is displayed.

pipeline = gst_parse_launch("imagesequencesrc location=/home/pi/amazon-kinesis-video-streams-webrtc-sdk-c/build/samples/image-%05d.png start-index=1 stop-index=-1 framerate=1/1 ! decodebin ! videoconvert ! compositor name=comp !" 
                            "appsink sync=TRUE emit-signals=TRUE name appsink-imageseq autovideosrc ! clockoverlay ! queue ! videoconvert ! video/x-raw,width=1280,height=720,framerate=25/1 ! comp. !"
                            "x264enc bframes=0 speed-preset=veryfast bitrate=2048 byte-stream=TRUE tune=zerolatency !"
                            "video/x-h264,stream-format=byte-stream,alignment=au,profile=baseline ! appsink sync=TRUE emit-signals=TRUE "
                            "name=appsink-video autoaudiosrc ! "
                            "queue leaky=2 max-size-buffers=400 ! audioconvert ! audioresample ! opusenc ! "
                            "audio/x-opus,rate=48000,channels=2 ! appsink sync=TRUE emit-signals=TRUE name=appsink-audio !" ,
                            &error);

Using autovideosink instead of appsink to terminate the imagesequencesrc branch results in a working video and audio stream that is transmitted to the web service (AWS KVS WebRTC), but the image sequence is displayed on the Raspberry Pi's desktop in a 1280x720 window, without the video. So the image sequence is not merged with the video/audio; it is only rendered locally.

I expected compositor to at least merge the images with the x-raw video before encoding and linking audio, but I was wrong. I have tried moving the comp. link through different stages of the pipeline, but it does not work as intended. I suspect I am simply misusing the compositor syntax, but I can't figure it out despite hours in the GStreamer documentation. Grateful for any suggestions or insight.
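For reference, my reading of the docs is that each video branch should end at a request pad on the named compositor (written as branch ! comp.), and the encoder should follow the compositor element itself; since compositor only handles video, the audio branch would stay separate. So I would have expected the target shape to be something along these lines (an untested sketch, reusing the element settings and appsink names from my code above):

```
imagesequencesrc location=/home/pi/amazon-kinesis-video-streams-webrtc-sdk-c/build/samples/image-%05d.png start-index=1 stop-index=-1 framerate=1/1 ! decodebin ! videoconvert ! comp.
autovideosrc ! clockoverlay ! queue ! videoconvert ! video/x-raw,width=1280,height=720,framerate=25/1 ! comp.
compositor name=comp ! x264enc bframes=0 speed-preset=veryfast bitrate=2048 byte-stream=TRUE tune=zerolatency ! video/x-h264,stream-format=byte-stream,alignment=au,profile=baseline ! appsink sync=TRUE emit-signals=TRUE name=appsink-video
autoaudiosrc ! queue leaky=2 max-size-buffers=400 ! audioconvert ! audioresample ! opusenc ! audio/x-opus,rate=48000,channels=2 ! appsink sync=TRUE emit-signals=TRUE name=appsink-audio
```

But this is only how I currently understand the branch syntax, and I may well be wrong about it.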
