I am writing a C++ application in which I have a set of frames (as unsigned char* buffers) that I need to encode into a video with GStreamer's H.265 encoder running on the GPU. Most of the GStreamer samples work directly with a camera, but in my case there is no camera.
Using some samples I put together a video encoder, but the frames don't get pushed into the video file and the output video is empty.
Here is the code I implemented:
GstElement *pipeline, *appsrc, *videoconvert, *x264enc, *mp4mux, *filesink, *autovideosink;
GstCaps *caps;
GstBuffer *buf;
GstMapInfo map;
gst_init(nullptr, nullptr);
pipeline = gst_pipeline_new("mypipeline");
// Create elements
appsrc = gst_element_factory_make("appsrc", "mysource");
videoconvert = gst_element_factory_make("videoconvert", "myconvert");
x264enc = gst_element_factory_make("x264enc", "myencoder");
mp4mux = gst_element_factory_make("mp4mux", "mymux");
filesink = gst_element_factory_make("filesink", "myfileoutput");
if (!pipeline || !appsrc || !videoconvert || !x264enc || !mp4mux || !filesink) {
g_printerr("Not all elements could be created.\n");
// return -1;
}
// Set the properties for filesink
g_object_set(filesink, "location", "output.mp4", NULL);
// Build the pipeline
gst_bin_add(GST_BIN(pipeline), appsrc);
gst_bin_add(GST_BIN(pipeline), videoconvert);
gst_bin_add(GST_BIN(pipeline), x264enc);
gst_bin_add(GST_BIN(pipeline), mp4mux);
gst_bin_add(GST_BIN(pipeline), filesink);
// Link the elements
gst_element_link(appsrc, videoconvert);
gst_element_link(videoconvert, x264enc);
gst_element_link(x264enc, mp4mux);
gst_element_link(mp4mux, filesink);
caps = gst_caps_from_string("video/x-raw, format=(string)BGR, width=(int)800, height=(int)600, framerate=(fraction)30/1");
gst_element_set_state(pipeline, GST_STATE_PLAYING);
for (int i = 0; i < 10; i++) {
buf = gst_buffer_new_and_alloc(800 * 600 * 3); // Assuming BGR format
gst_buffer_map(buf, &map, GST_MAP_WRITE);
memset(map.data, i, 800 * 600 * 3); // Filling with dummy data
gst_buffer_unmap(buf, &map);
gst_app_src_push_buffer(GST_APP_SRC(appsrc), buf);
}
gst_app_src_end_of_stream(GST_APP_SRC(appsrc));
GstBus *bus = gst_element_get_bus(pipeline);
GstMessage *msg = gst_bus_timed_pop(bus, GST_CLOCK_TIME_NONE);
if (msg != NULL)
gst_message_unref(msg);
gst_object_unref(bus);
gst_element_set_state(pipeline, GST_STATE_NULL);
gst_object_unref(pipeline);
It seems that gst_app_src_push_buffer does nothing, and I have no clue why. Is there any mistake here?
You have a couple of issues in your code:
1. The buffers you push carry no timestamps. You could set the appsrc's do-timestamp property to true in order to do the auto-timestamping for you, but you cannot use it here because you push buffers to the pipeline without waiting for the pipeline to reach the playing state first. Alternatively, you could additionally set the appsrc's is-live property to true to let appsrc push the buffers downstream once the playing state is reached; however, the auto-timestamped buffers would then play at a much higher framerate than the configured 30 FPS. If you want 30 FPS, something like the snippet right after this list sets the buffer timestamps properly.
2. The appsrc's format property should be set to the GST_FORMAT_TIME value.
3. You create the caps but never set them on the appsrc element.
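A minimal sketch of that manual timestamping, assuming the 30 FPS rate from your caps and the loop index i as the frame counter; it goes inside the push loop, before gst_app_src_push_buffer:

// Stamp frame i by hand: it starts at i/30 s and lasts 1/30 s.
GST_BUFFER_PTS(buf) = gst_util_uint64_scale(i, GST_SECOND, 30);
GST_BUFFER_DTS(buf) = GST_BUFFER_PTS(buf);
GST_BUFFER_DURATION(buf) = gst_util_uint64_scale(1, GST_SECOND, 30);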
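The following fixed version works fine. It is shown here as a sketch applying the three fixes above; the element names, frame size, and dummy data are kept from your code, and gst_bus_timed_pop_filtered is used so the program waits for EOS or an error rather than for the first message of any kind:

#include <gst/gst.h>
#include <gst/app/gstappsrc.h>
#include <cstring>

int main() {
    gst_init(nullptr, nullptr);

    GstElement *pipeline = gst_pipeline_new("mypipeline");
    GstElement *appsrc = gst_element_factory_make("appsrc", "mysource");
    GstElement *videoconvert = gst_element_factory_make("videoconvert", "myconvert");
    GstElement *x264enc = gst_element_factory_make("x264enc", "myencoder");
    GstElement *mp4mux = gst_element_factory_make("mp4mux", "mymux");
    GstElement *filesink = gst_element_factory_make("filesink", "myfileoutput");

    if (!pipeline || !appsrc || !videoconvert || !x264enc || !mp4mux || !filesink) {
        g_printerr("Not all elements could be created.\n");
        return -1;
    }

    g_object_set(filesink, "location", "output.mp4", NULL);

    // Fix 3: actually assign the caps to appsrc instead of only creating them.
    GstCaps *caps = gst_caps_from_string(
        "video/x-raw, format=(string)BGR, width=(int)800, height=(int)600, framerate=(fraction)30/1");
    gst_app_src_set_caps(GST_APP_SRC(appsrc), caps);
    gst_caps_unref(caps);

    // Fix 2: appsrc must operate in time format for the timestamps below.
    g_object_set(appsrc, "format", GST_FORMAT_TIME, NULL);

    gst_bin_add_many(GST_BIN(pipeline), appsrc, videoconvert, x264enc, mp4mux, filesink, NULL);
    if (!gst_element_link_many(appsrc, videoconvert, x264enc, mp4mux, filesink, NULL)) {
        g_printerr("Elements could not be linked.\n");
        gst_object_unref(pipeline);
        return -1;
    }

    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    for (int i = 0; i < 10; i++) {
        GstBuffer *buf = gst_buffer_new_and_alloc(800 * 600 * 3); // BGR: 3 bytes per pixel
        GstMapInfo map;
        gst_buffer_map(buf, &map, GST_MAP_WRITE);
        memset(map.data, i, 800 * 600 * 3); // dummy frame data
        gst_buffer_unmap(buf, &map);

        // Fix 1: timestamp each buffer so downstream knows where it belongs.
        GST_BUFFER_PTS(buf) = gst_util_uint64_scale(i, GST_SECOND, 30);
        GST_BUFFER_DURATION(buf) = gst_util_uint64_scale(1, GST_SECOND, 30);

        gst_app_src_push_buffer(GST_APP_SRC(appsrc), buf); // takes ownership of buf
    }
    gst_app_src_end_of_stream(GST_APP_SRC(appsrc));

    // Wait specifically for EOS or an error so the muxer can finalize the file.
    GstBus *bus = gst_element_get_bus(pipeline);
    GstMessage *msg = gst_bus_timed_pop_filtered(bus, GST_CLOCK_TIME_NONE,
        (GstMessageType)(GST_MESSAGE_ERROR | GST_MESSAGE_EOS));
    if (msg != NULL)
        gst_message_unref(msg);
    gst_object_unref(bus);

    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);
    return 0;
}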
Compiling, running, and testing (on macOS):
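Something along these lines, assuming the program is saved as main.cpp and GStreamer is installed so that pkg-config can find it (the file name and the binary name are assumptions):

g++ -std=c++17 main.cpp -o appsrc_demo $(pkg-config --cflags --libs gstreamer-1.0 gstreamer-app-1.0)
./appsrc_demo
gst-discoverer-1.0 output.mp4   # inspect the resulting file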
I see that you wrote that code in C++. Please also consider using the RAII technique to avoid managing resources manually. A simple RAII wrapper can help, simplify the code, and make it less error-prone. For instance:
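A minimal sketch of such a wrapper, built on std::unique_ptr with custom deleters (the names GstObjectPtr and GstCapsPtr are illustrative, not an existing API):

#include <gst/gst.h>
#include <memory>

// Deleter that unrefs any GstObject-derived pointer (elements, pipelines, buses).
struct GstObjectUnref {
    void operator()(gpointer obj) const { if (obj) gst_object_unref(obj); }
};
template <typename T>
using GstObjectPtr = std::unique_ptr<T, GstObjectUnref>;

// Caps are not GstObjects and have their own unref function.
struct GstCapsUnref {
    void operator()(GstCaps *caps) const { if (caps) gst_caps_unref(caps); }
};
using GstCapsPtr = std::unique_ptr<GstCaps, GstCapsUnref>;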
Example usage:
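A sketch using the hypothetical wrappers above. Note that gst_bin_add makes the bin take ownership of an element, so only the objects you own yourself (the pipeline, the bus, the caps) get a guard:

GstObjectPtr<GstElement> pipeline(gst_pipeline_new("mypipeline"));
GstCapsPtr caps(gst_caps_from_string(
    "video/x-raw, format=(string)BGR, width=(int)800, height=(int)600, framerate=(fraction)30/1"));
GstElement *appsrc = gst_element_factory_make("appsrc", "mysource"); // the bin will own this
if (!pipeline || !caps || !appsrc)
    return -1;
gst_app_src_set_caps(GST_APP_SRC(appsrc), caps.get());
gst_bin_add(GST_BIN(pipeline.get()), appsrc);
// ... link the elements, set the pipeline to PLAYING, and push buffers as before ...
GstObjectPtr<GstBus> bus(gst_element_get_bus(pipeline.get()));
// ... wait for EOS or an error on the bus ...
gst_element_set_state(pipeline.get(), GST_STATE_NULL); // still stop the pipeline first
// No manual gst_object_unref calls: the guards release everything on scope exit.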