I am attempting to create a DirectShow source filter based on the pushsource example from the DirectShow SDK. My source filter essentially pushes a series of bitmaps into a video, with each bitmap typically shown for around 600 milliseconds. I have set up a filter graph which uses an Async_reader with a Wave Parser for the audio and my new filter to push the video (the filter is a CSourceStream and I populate my frames in the FillBuffer function). Both are connected to a WM ASF Writer to output a WMV.
What I am finding is that at times corresponding to the keyframe interval the video pauses for a second or two. This seems to be worse at HD resolutions, though that could be a red herring. What kinds of things might be causing this? Is it related to how often I allow FillBuffer to be called (my frame rate is 30 fps, so I end up with the same bitmap being repeated for several frames)?
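For reference, here is roughly what my FillBuffer does (a stripped-down sketch; the member names follow the pushsource sample, and CopyCurrentBitmap stands in for my own bitmap copy):

```cpp
HRESULT CMyPushPin::FillBuffer(IMediaSample *pSample)
{
    CheckPointer(pSample, E_POINTER);

    BYTE *pData = NULL;
    HRESULT hr = pSample->GetPointer(&pData);
    if (FAILED(hr))
        return hr;
    long cbData = pSample->GetSize();

    // Copy the current bitmap into the sample buffer. Each source bitmap is
    // held for ~600 ms, i.e. roughly 18 consecutive frames at 30 fps.
    CopyCurrentBitmap(pData, cbData);   // placeholder for my own copy code

    // Stamp the sample with start/stop times derived from the 30 fps rate.
    // m_rtFrameLength is UNITS / 30 (about 333333 in 100 ns units).
    REFERENCE_TIME rtStart = m_iFrameNumber * m_rtFrameLength;
    REFERENCE_TIME rtStop  = rtStart + m_rtFrameLength;
    pSample->SetTime(&rtStart, &rtStop);
    m_iFrameNumber++;

    // Uncompressed frames are all key frames as far as the pin is concerned.
    pSample->SetSyncPoint(TRUE);

    return S_OK;
}
```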
Common sense (as opposed to specific experience with the effect in question) suggests that encoding a key frame consumes too much of the available bandwidth, and what remains is insufficient to encode the motion in the segment that immediately follows the key frame. I assume this is CBR mode; VBR mode might give a better encoding.
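Untested, and the helper name and quality value are mine, but switching the WM ASF Writer's video stream to quality-based VBR would look something like this through IConfigAsfWriter and the Windows Media Format SDK (error handling trimmed; whether the encoder honors it depends on the codec and profile in use):

```cpp
#include <atlbase.h>    // CComPtr
#include <dshowasf.h>   // IConfigAsfWriter
#include <wmsdk.h>      // IWMProfile, IWMPropertyVault, g_wszVBR*

HRESULT EnableQualityVBR(IBaseFilter *pAsfWriter, DWORD dwQuality /* 0..100 */)
{
    CComPtr<IConfigAsfWriter> pConfig;
    HRESULT hr = pAsfWriter->QueryInterface(IID_PPV_ARGS(&pConfig));
    if (FAILED(hr)) return hr;

    CComPtr<IWMProfile> pProfile;
    hr = pConfig->GetCurrentProfile(&pProfile);
    if (FAILED(hr)) return hr;

    DWORD cStreams = 0;
    pProfile->GetStreamCount(&cStreams);

    for (DWORD i = 0; i < cStreams; i++)
    {
        CComPtr<IWMStreamConfig> pStream;
        if (FAILED(pProfile->GetStream(i, &pStream)))
            continue;

        GUID streamType;
        pStream->GetStreamType(&streamType);
        if (streamType != WMMEDIATYPE_Video)
            continue;

        CComPtr<IWMPropertyVault> pVault;
        if (FAILED(pStream->QueryInterface(IID_PPV_ARGS(&pVault))))
            continue;

        // Request quality-based VBR on the video stream.
        BOOL fVBR = TRUE;
        pVault->SetProperty(g_wszVBREnabled, WMT_TYPE_BOOL,
                            (BYTE*)&fVBR, sizeof(fVBR));
        pVault->SetProperty(g_wszVBRQuality, WMT_TYPE_DWORD,
                            (BYTE*)&dwQuality, sizeof(dwQuality));

        pProfile->ReconfigStream(pStream);
    }

    // Push the modified profile back into the ASF writer filter.
    return pConfig->ConfigureFilterUsingProfile(pProfile);
}
```

This has to be done before the graph runs; alternatively, increasing the CBR bitrate or shortening the keyframe interval in the profile may also reduce the stall after each key frame.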