In my Python program I use GStreamer's playbin in combination with a textoverlay to play a video file and show some text on top of it.
This works fine: if I change the text property of the textoverlay, the new text is shown.
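For context, a minimal sketch of such a setup might look like this (assuming GStreamer 1.x with PyGObject; the URI is a placeholder, and the textoverlay is attached via playbin's video-filter property, which is one way of inserting it into playbin's video path):

```python
import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst, GLib

Gst.init(None)

playbin = Gst.ElementFactory.make('playbin', None)
textoverlay = Gst.ElementFactory.make('textoverlay', None)

playbin.set_property('uri', 'file:///path/to/video.mp4')  # placeholder
# Insert the textoverlay into playbin's video path.
playbin.set_property('video-filter', textoverlay)
textoverlay.set_property('text', 'Hello')

playbin.set_state(Gst.State.PLAYING)

# Changing the text property at any time updates the overlay, e.g. after 2 s:
def change_text():
    textoverlay.set_property('text', 'New text')
    return False  # run once

GLib.timeout_add_seconds(2, change_text)
GLib.MainLoop().run()
```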
But now I want to set the text based on the video's current position/time (like subtitles).
I have read about the pipeline clock, buffer timestamps, segment events, and external timers that query the current position every x milliseconds. But what is the best practice for being notified about time changes so that I can show the correct text as soon as possible?
The best way to keep the text really synchronized with the video would be to use something like the cairooverlay element and do the rendering yourself directly inside the pipeline, based on the actual timestamps of the frames. Alternatively, write your own element for that.
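A hedged sketch of that approach (reusing the playbin setup from the question, again via the assumed video-filter property, with an illustrative hard-coded subtitle table): cairooverlay's draw signal hands you a cairo context together with the frame's timestamp, so the text is chosen per frame:

```python
import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst, GLib
import cairo

Gst.init(None)

# Illustrative subtitle table: (start_ns, end_ns, text)
SUBTITLES = [
    (0 * Gst.SECOND, 2 * Gst.SECOND, 'First line'),
    (2 * Gst.SECOND, 5 * Gst.SECOND, 'Second line'),
]

def on_draw(overlay, context, timestamp, duration):
    # timestamp is the frame's timestamp in nanoseconds, so the text is
    # picked exactly for the frame being rendered.
    for start, end, text in SUBTITLES:
        if start <= timestamp < end:
            context.set_source_rgb(1, 1, 1)
            context.select_font_face('Sans', cairo.FONT_SLANT_NORMAL,
                                     cairo.FONT_WEIGHT_BOLD)
            context.set_font_size(32)
            context.move_to(50, 400)
            context.show_text(text)
            break

playbin = Gst.ElementFactory.make('playbin', None)
playbin.set_property('uri', 'file:///path/to/video.mp4')  # placeholder

# Wrap cairooverlay in converters so it gets a raw format cairo can draw on,
# and plug the whole bin into playbin's video path.
overlay_bin = Gst.parse_bin_from_description(
    'videoconvert ! cairooverlay name=overlay ! videoconvert', True)
overlay_bin.get_by_name('overlay').connect('draw', on_draw)
playbin.set_property('video-filter', overlay_bin)

playbin.set_state(Gst.State.PLAYING)
GLib.MainLoop().run()
```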
The easiest solution, if the timing does not need to be super accurate, is to use the pipeline clock. You can get it from the pipeline once it has started, create single-shot (or periodic) clock IDs for the time or interval you want, and then wait on them asynchronously with wait_async().
To get the clock time that corresponds to, for example, the 1-second position of the pipeline, you would add 1 second (i.e. 1000000000 nanoseconds, or Gst.SECOND) to the pipeline's base time. You can then use that value when creating the clock IDs.
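A hedged sketch of that clock-based approach, reusing the playbin and textoverlay names from the question (the exact Python mapping of gst_clock_id_wait_async is assumed here to be Gst.Clock.id_wait_async):

```python
from gi.repository import Gst, GLib

def show_text_at(pipeline, textoverlay, position_ns, text):
    # Only valid once the pipeline is running and has selected a clock.
    clock = pipeline.get_clock()
    base_time = pipeline.get_base_time()
    target = base_time + position_ns  # clock time matching that stream position

    clock_id = clock.new_single_shot_id(target)

    def on_clock(clock, time, cid, user_data):
        # Runs in a clock thread, so push the property change to the main loop.
        GLib.idle_add(textoverlay.set_property, 'text', text)
        return True

    Gst.Clock.id_wait_async(clock_id, on_clock, None)

# e.g. show a text once playback reaches the 1-second position:
# show_text_at(playbin, textoverlay, 1 * Gst.SECOND, 'One second in')
```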