I would like to periodically save a snapshot from a video stream using ffmpeg. This is possible using the following command:
ffmpeg -i [source] -r 0.2 -f image2 ./image-%04d.jpeg
or the (rough) equivalent in ffmpeg-python:
(
    ffmpeg.input(source, **inputArgs)
    .filter("fps", fps=1 / 5, round="up")
    .output("./image-%04d.jpeg")
    .run()
)
I would like to do this for N input streams simultaneously. So, I would be saving 1 frame every 5 seconds for each of the input streams. Is this possible using ffmpeg or ffmpeg-python? I can't find any examples with multiple input streams.
Just for some more information: the sources are all RTSP streams and N can be between 10 and 60. I would also like to limit the number of files on disk at any one time (similar to GStreamer's max-files property), but this is less important.
One approach would be to use run_async(). From the docs: https://kkroening.github.io/ffmpeg-python/index.html#ffmpeg.run_async

This creates a subprocess which runs FFmpeg concurrently with your Python program. You can then call run_async() multiple times, once per input stream.

Danger note: if you use pipe_stdin, pipe_stdout, or pipe_stderr, be aware that this will cause your FFmpeg processes to freeze if you do not call process.communicate() on each subprocess occasionally to drain their output pipes.
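
Here is a minimal sketch of that approach, assuming ffmpeg-python is installed; the RTSP URLs, the rtsp_transport option, and the per-stream filename prefix are all placeholder assumptions, not requirements:

import ffmpeg

# Hypothetical RTSP sources: replace with your real stream URLs.
sources = [
    "rtsp://camera1.example/stream",
    "rtsp://camera2.example/stream",
]

processes = []
for i, source in enumerate(sources):
    stream = (
        ffmpeg
        .input(source, rtsp_transport="tcp")   # assumption: TCP transport for RTSP
        .filter("fps", fps=1 / 5, round="up")  # one frame every 5 seconds
        .output(f"./stream{i}-image-%04d.jpeg")
        .overwrite_output()
    )
    # run_async() returns immediately with a subprocess.Popen handle,
    # so this loop launches all N ffmpeg processes concurrently.
    processes.append(stream.run_async())

# Block until the captures end (they run until the streams stop or are killed).
for p in processes:
    p.wait()

Since no pipe_stdin, pipe_stdout, or pipe_stderr is requested here, the danger note above does not apply; if you do enable those pipes, call communicate() on each process instead of wait() so their output pipes get drained.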