Examples of ffmpeg that I've seen so far all seem to accept a file on disk as input and transcode it into another file on disk as output. I've also come across ffserver, which can be used to stream out video. However, I have yet to find a good tutorial or example of ffmpeg being used to transcode streaming video/audio, constrained by parameters such as running time, number of frames, or some other event, and to save the transcoded media to disk.
Any pointers, tips or hints would really help.
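For concreteness, the kind of invocation I have in mind would look something like the sketch below (the stream URL and codec choices are placeholders, not something I've verified end to end):

```shell
# Read a live stream, transcode it, and stop after whichever limit is hit
# first: 60 seconds of output (-t) or 1500 encoded video frames (-frames:v).
# The RTSP URL is a placeholder; any stream input ffmpeg supports should work.
ffmpeg -i rtsp://example.com/live \
       -t 60 -frames:v 1500 \
       -c:v libx264 -c:a aac \
       output.mp4
```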
After significant research, I've concluded that GStreamer is the ideal mechanism for this (a framework with a set of tools and libraries). It lets me do pretty much everything I want in the way of "transcoding" (frame-rate control, re-encoding, frame-size changes, etc.), and it also lets me restream as well as store to disk.
While the framework expects this to be done programmatically, a set of command-line tools also lets one create transformation pipelines, which are quite intuitive. There is decent documentation, although there is definitely scope for improvement. The best part is that it allows one to invoke several third-party libraries as plugins, for instance ffmpeg and effects plugins for audio and video.
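As a rough illustration of such a command-line pipeline (a sketch only: the stream URI is a placeholder, and a real pipeline with audio would also need queue elements and an audio branch into the muxer):

```shell
# Take a live stream, rescale to 640x360, drop to 15 fps, re-encode to H.264,
# and write the result to an MP4 file on disk. The -e flag makes gst-launch
# send EOS on Ctrl+C so that mp4mux can finalize the file cleanly.
gst-launch-1.0 -e \
  uridecodebin uri=rtsp://example.com/stream \
  ! videoconvert \
  ! videoscale ! video/x-raw,width=640,height=360 \
  ! videorate  ! video/x-raw,framerate=15/1 \
  ! x264enc \
  ! mp4mux \
  ! filesink location=out.mp4
```

Swapping filesink for a network sink (or adding a tee element before the encoder) is how the same pipeline can restream instead of, or in addition to, storing to disk.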