This is my first time asking a question here.
I'm trying to create a GStreamer pipeline from Python3 using gi.repository. I want to build it this way instead of using the deepstream-app or gst-launch-1.0 applications because in the future I want to use NVIDIA modules and run a DNN to detect objects in the streaming video, and I think this approach will make it easier to extract the metadata from the DNN inference.
Looking around I found this pipeline:
v4l2src device=/dev/video0 ! video/x-raw ! videoconvert ! x264enc ! h264parse config-interval=3 ! qtmux ! filesink location=video.mp4
It works perfectly and creates a .mp4 file from my USB webcam's video when I execute it with the gst-launch-1.0 application or with the Gst.parse_launch() function from Gst in gi.repository.
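For reference, this is roughly how I run it from Python with Gst.parse_launch (a minimal sketch; the pipeline string is exactly the one above):
import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst

Gst.init(None)
# The same pipeline description as above, handed to parse_launch as a single string
pipeline = Gst.parse_launch("v4l2src device=/dev/video0 ! video/x-raw ! videoconvert ! x264enc ! h264parse config-interval=3 ! qtmux ! filesink location=video.mp4")
pipeline.set_state(Gst.State.PLAYING)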
But when I try to build the pipeline by creating each element individually and linking them in Python code, I can't make it work. It creates a .mp4 file of around 1 MB, but I can't play it back. It returns the error 0xc00d36c4 in the Video app on Windows 10, and nothing happens in VLC.
It may be something related to the muxer element (qtmux or mp4mux), because I have seen another example that links the sink pad of the muxer to the source pad of the previous element in the pipeline. It has something to do with dynamic pads, but I don't really understand when they are necessary.
I will leave parts of my code here so you can guide me to where the problem is. The part where I tried to link the source pad of the parser to the sink pad of the muxer is commented out, because I get the same result without it.
The imports.
import sys
import gi
gi.require_version('Gst', '1.0')
from gi.repository import GLib, Gst
import time
import pyds
The creation of the pipeline and its elements.
Gst.init(None)
player=Gst.Pipeline.new("player")
print("Pipeline created")
v4l2Source=Gst.ElementFactory.make("v4l2src","v4l2Source")
v4l2Source.set_property("device", "/dev/video0")
print("USB cam source created")
caps = Gst.Caps.from_string("video/x-raw")
filter= Gst.ElementFactory.make("capsfilter", "filter")
filter.set_property("caps", caps)
videoconvert = Gst.ElementFactory.make("videoconvert", "converter")
encoder = Gst.ElementFactory.make("x264enc","venc")
parser=Gst.ElementFactory.make("h264parse","parser")
parser.set_property("config-interval", 3)
muxer=Gst.ElementFactory.make("mp4mux","muxer")
filesink=Gst.ElementFactory.make("filesink","sinker")
filesink.set_property("location","pvideo.mp4")
print("All elements created")
i=0
for ele in [v4l2Source,filter,videoconvert,encoder,parser,muxer,filesink]:
    print(f"Element {i} added to the pipeline")
    player.add(ele)
    i+=1
print("All Elements added to the pipeline")
The linking of every element. The error is probably in this part. As I mentioned before, the pipeline works if executed with gst-launch-1.0 or Gst.parse_launch().
v4l2Source.link(filter)
filter.link(videoconvert)
videoconvert.link(encoder)
encoder.link(parser)
'''
sinkpad = muxer.get_request_pad("video_0")
if not sinkpad:
    sys.stderr.write(" Unable to get the sink pad of streammux \n")
srcpad = parser.get_static_pad("src")
if not srcpad:
    sys.stderr.write(" Unable to get source pad of caps_vidconvsrc \n")
srcpad.link(sinkpad)
'''
parser.link(muxer)
muxer.link(filesink)
print("All elements linked")
Finally, the execution and stopping of the pipeline. It records around 20 seconds of footage before stopping.
player.set_state(Gst.State.PLAYING)
print("Recording")
time.sleep(20)
print('Sending EOS')
player.send_event(Gst.Event.new_eos())
print('Waiting for EOS')
@mo_oises
Your code looks good! Just one thing is missing to make the GStreamer pipeline work with gi.repository in Python3: you need to wait for the EOS (End of Stream) message on the GStreamer bus before closing the application; otherwise the muxer never gets the chance to finalize the file. You can do this with the bus's timed_pop_filtered(timeout, types) method.
For instance, appending something like this to the end of your code resolves the issue:
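bus = player.get_bus()
# Block until the EOS has travelled through the whole pipeline (or an error occurs),
# so the muxer can write the MP4 headers before the script exits
bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE, Gst.MessageType.EOS | Gst.MessageType.ERROR)
player.set_state(Gst.State.NULL)
timed_pop_filtered() blocks until one of those messages arrives, which is what guarantees the muxer has finished writing the file; only then is it safe to set the pipeline to NULL and exit.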
Cheers, -- Daniel