DirectShow.NET: retrieving the IAMDroppedFrames interface from a filter graph


I am using the DirectShow.NET library in C# to write a program that captures frames from an ATI AllInWonder video card in order to convert old VHS tapes to digital. The card is old and uses the AGP port; I only have two computers with that port, and their CPUs are too slow to capture the video signal and encode it to a video file. So I took the DxLogo example from the DirectShow.NET samples and modified it to just dump the raw frame data to a file to be parsed later.

My filter graph is a source filter connected to a Smart Tee, which the capture graph builder added automatically, followed by a SampleGrabber connected to a null renderer. The SampleGrabber callback just dumps the RGB24 frame data to an output file.
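In code, that topology boils down to the single RenderStream call from the full listing below (the Smart Tee is the part the capture graph builder inserts on its own when it connects the capture pin):

hr = capGraph.RenderStream(PinCategory.Capture, MediaType.Video, capFilter, baseGrabFlt, nullFilter);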

I am able to record a video signal and convert it to an AVI file just fine. However, I am only getting a frame rate of about 15 fps, even though the source filter is configured for 30 fps. I know the card is capable of 30 fps because the original software offers that option. The CPU sits at around 60% load, so I don't think it is the bottleneck.
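(For reference, the 30 fps request is applied through IAMStreamConfig in SetConfigParms in the full code below, essentially:)

// AvgTimePerFrame is in 100 ns units, so 10,000,000 / 30 ≈ 333,333 for 30 fps
v.AvgTimePerFrame = 10000000 / iFrameRate;
hr = videoStreamConfig.SetFormat(media);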

I would like to use the IAMDroppedFrames interface on my capture filter to see whether I am actually dropping frames or whether the problem is elsewhere. I can get the interface and read data from it inside my graph-building function; however, since the filter references are released after the graph is built, I lose access to the IAMDroppedFrames interface afterwards.
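To illustrate what I am after, here is a minimal sketch, assuming the reference returned by FindInterface in SetupGraph could simply be kept in a member field (m_droppedFrames and GetDroppedFrameCount are placeholder names, not part of the DxLogo sample):

private IAMDroppedFrames m_droppedFrames = null;

// In SetupGraph, after RenderStream has connected the graph:
object o;
hr = capGraph.FindInterface(FindDirection.DownstreamOnly, null, capFilter,
    typeof(IAMDroppedFrames).GUID, out o);
DsError.ThrowExceptionForHR(hr);
m_droppedFrames = o as IAMDroppedFrames;

// Later, while the graph is running (e.g. polled from a timer on the form):
public int GetDroppedFrameCount()
{
    int dropped = 0;
    if (m_droppedFrames != null)
    {
        int hr = m_droppedFrames.GetNumDropped(out dropped);
        DsError.ThrowExceptionForHR(hr);
    }
    return dropped;
}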

I have tried using the EnumFilters and EnumPins interfaces, with frustrating results. I can successfully retrieve the IBaseFilter objects for all of my filters, and I can get the name of my source filter, which is "Logitech Webcam" (I am testing with a webcam so that I can write the code on my main machine instead of in Visual Studio 2010 Express on Windows XP). However, when I call EnumPins and then pins.Next on the resulting enumerator, I get the error "System.InvalidCastException: 'Unable to cast object of type 'System.__ComObject' to type 'DirectShowLib.IPin'.'" For some reason pins.Next returns a __ComObject instead of an IPin interface. If I run the exact same code against the Smart Tee filter, I can retrieve the IPin objects and access the interfaces on its input and output pins.

So basically I am looking for a way to read the IAMDroppedFrames count while the capture is running, and the only way I know of is through EnumFilters and EnumPins. The frustrating part is that the one filter I actually need the pins from is the only one I cannot enumerate; the pins on every other filter are accessible.

How can I get a running count of the dropped frames displayed on my form while the capture is running?
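On the form side, what I have in mind is roughly this (a sketch only; capture and lblDropped are placeholder names for my Capture instance and a Label, and GetDroppedFrameCount is the hypothetical helper sketched above):

// Poll the capture object once a second and display the running count.
var pollTimer = new System.Windows.Forms.Timer();
pollTimer.Interval = 1000;
pollTimer.Tick += (s, e) => lblDropped.Text = "Dropped: " + capture.GetDroppedFrameCount();
pollTimer.Start();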

Code for getting the IPin references:

IEnumFilters filters = null;
m_FilterGraph.EnumFilters(out filters);
IntPtr numFilters = new IntPtr();
IBaseFilter[] filtersFound = new IBaseFilter[1];
filters.Skip(3);
filters.Next(1, filtersFound, numFilters);
FilterInfo info;
filtersFound[0].QueryFilterInfo(out info);
System.Diagnostics.Debug.WriteLine(info.achName);
IEnumPins pins = null;
filtersFound[0].EnumPins(out pins);
IntPtr numPins = new IntPtr();
IPin[] pinsFound = new IPin[1];
pins.Next(1, pinsFound, numPins);
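For comparison, the DsFindPin helper (which the full code already uses in SetConfigParms) would be a shorter way to reach a pin directly, assuming a valid IBaseFilter reference to the capture filter is available at the call site:

IPin capturePin = DsFindPin.ByCategory(capFilter, PinCategory.Capture, 0);
if (capturePin != null)
{
    // ... query whatever is needed on the pin ...
    Marshal.ReleaseComObject(capturePin);
}

That avoids the IEnumPins call entirely, although it does not explain the InvalidCastException above.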

Full Code:

/****************************************************************************
While the underlying libraries are covered by LGPL, this sample is released 
as public domain.  It is distributed in the hope that it will be useful, but 
WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY 
or FITNESS FOR A PARTICULAR PURPOSE.  
*****************************************************************************/

using System;
using System.Drawing;
using System.Drawing.Imaging;
using System.Runtime.InteropServices;
using System.Diagnostics;
using System.IO;

using DirectShowLib;
using System.ComponentModel;

namespace DxLogo
{
    /// <summary> Summary description for MainForm. </summary>
    internal class Capture : ISampleGrabberCB, IDisposable
    {
        //static long milliseconds = DateTime.Now.Ticks / TimeSpan.TicksPerMillisecond;
        //static bool written = false;
        #region Member variables

        /// <summary> graph builder interface. </summary>
        private IFilterGraph2 m_FilterGraph = null;
        IMediaControl m_mediaCtrl = null;

        /// <summary> Set by async routine when it captures an image </summary>
        private bool m_bRunning = false;

        /// <summary> Dimensions of the image, calculated once in constructor. </summary>
        private int m_videoWidth;
        private int m_videoHeight;
        private int m_stride;

        // Output file for the raw RGB24 frame dump; each buffer from the grabber callback is appended here.
        FileStream file = File.Open("C:\\Users\\Admin\\Desktop\\CapturedFrames\\output", FileMode.OpenOrCreate, FileAccess.Write);

        BitmapData m_bmdLogo = null;
        Bitmap m_Bitmap = null;

#if DEBUG
        // Allow you to "Connect to remote graph" from GraphEdit
        DsROTEntry m_rot = null;
        
#endif

        #endregion

        #region API

        [DllImport("Kernel32.dll", EntryPoint="RtlMoveMemory")]
        private static extern void CopyMemory(IntPtr Destination, IntPtr Source, [MarshalAs(UnmanagedType.U4)] uint Length);

        #endregion

        /// zero based device index, and some device parms, plus the file name to save to
        public Capture(int iDeviceNum, int iFrameRate, int iWidth, int iHeight, string FileName)
        {
            DsDevice[] capDevices;

            // Get the collection of video devices
            capDevices = DsDevice.GetDevicesOfCat(FilterCategory.VideoInputDevice);

            if (iDeviceNum + 1 > capDevices.Length)
            {
                throw new Exception("No video capture devices found at that index!");
            }

            try
            {
                // Set up the capture graph
                SetupGraph( capDevices[iDeviceNum], iFrameRate, iWidth, iHeight, FileName);
            }
            catch
            {
                Dispose();
                throw;
            }
        }

        /// <summary> release everything. </summary>
        public void Dispose()
        {
            CloseInterfaces();
            if (m_Bitmap != null)
            {
                m_Bitmap.UnlockBits(m_bmdLogo);
                m_Bitmap = null;
                m_bmdLogo = null;
            }
        }
        // Destructor
        ~Capture()
        {
            CloseInterfaces();
        }


        /// <summary> capture the next image </summary>
        public void Start()
        {
            if (!m_bRunning)
            {
                int hr = m_mediaCtrl.Run();
                DsError.ThrowExceptionForHR( hr );

                m_bRunning = true;
            }
        }
        // Pause the capture graph.
        // Running the graph takes up a lot of resources.  Pause it when it
        // isn't needed.
        public void Pause()
        {
            if (m_bRunning)
            {
                int hr = m_mediaCtrl.Pause();
                DsError.ThrowExceptionForHR( hr );

                m_bRunning = false;
            }
        }

        /// <summary> build the capture graph for grabber. </summary>
        private void SetupGraph(DsDevice dev, int iFrameRate, int iWidth, int iHeight, string FileName)
        {
            int hr;

            ISampleGrabber sampGrabber = null;
            IBaseFilter baseGrabFlt = null;
            IBaseFilter capFilter = null;
            //IBaseFilter muxFilter = null;
            IBaseFilter nullFilter = null;
            IFileSinkFilter fileWriterFilter = null;
            ICaptureGraphBuilder2 capGraph = null;

            // Get the graphbuilder object
            m_FilterGraph = new FilterGraph() as IFilterGraph2;
            m_mediaCtrl = m_FilterGraph as IMediaControl;

#if DEBUG
            m_rot = new DsROTEntry(m_FilterGraph);
#endif
            try
            {
                // Get the ICaptureGraphBuilder2
                capGraph = (ICaptureGraphBuilder2) new CaptureGraphBuilder2();

                // Get the SampleGrabber interface
                sampGrabber = (ISampleGrabber) new SampleGrabber();

                // Start building the graph
                hr = capGraph.SetFiltergraph( m_FilterGraph );
                DsError.ThrowExceptionForHR( hr );

                // Add the video device
                hr = m_FilterGraph.AddSourceFilterForMoniker(dev.Mon, null, dev.Name, out capFilter);
                DsError.ThrowExceptionForHR( hr );

                baseGrabFlt = (IBaseFilter) sampGrabber;
                ConfigureSampleGrabber(sampGrabber);

                // Add the frame grabber to the graph
                hr = m_FilterGraph.AddFilter( baseGrabFlt, "Ds.NET Grabber" );
                DsError.ThrowExceptionForHR( hr );

                // Create the Null Renderer ({C1F400A4-3F08-11D3-9F0B-006008039E37}) by CLSID;
                // it simply discards samples downstream of the SampleGrabber.
                Type type = Type.GetTypeFromCLSID(new Guid("C1F400A4-3F08-11d3-9F0B-006008039E37"));
                nullFilter = (IBaseFilter)Activator.CreateInstance(type);

                hr = m_FilterGraph.AddFilter(nullFilter, "Null Filter");
                DsError.ThrowExceptionForHR(hr);

                
                // If any of the default config items are set
                if (iFrameRate + iHeight + iWidth > 0)
                {
                    SetConfigParms(capGraph, capFilter, iFrameRate, iWidth, iHeight);
                }
                /*
                // Create a filter for the output avi file
                hr = capGraph.SetOutputFileName(MediaSubType.Avi, FileName, out muxFilter, out fileWriterFilter);
                DsError.ThrowExceptionForHR( hr );
                */
                // Connect everything together
                hr = capGraph.RenderStream( PinCategory.Capture, MediaType.Video, capFilter, baseGrabFlt,  nullFilter);
                DsError.ThrowExceptionForHR( hr );


                // Query IAMDroppedFrames on the capture filter once the graph is connected.
                // Note: droppedFramesObject is only a local, so the interface is unreachable after SetupGraph returns.
                object droppedFramesObject;
                capGraph.FindInterface(FindDirection.DownstreamOnly, null, capFilter, typeof(IAMDroppedFrames).GUID, out droppedFramesObject);
                IAMDroppedFrames droppedFrames = droppedFramesObject as IAMDroppedFrames;
                int numDropped, numNotDropped;
                droppedFrames.GetNumDropped(out numDropped);
                droppedFrames.GetNumNotDropped(out numNotDropped);
                System.Diagnostics.Debug.WriteLine("Dropped Frames");
                System.Diagnostics.Debug.WriteLine(numDropped);
                System.Diagnostics.Debug.WriteLine(numNotDropped);

                

                // Now that sizes are fixed, store the sizes
                SaveSizeInfo(sampGrabber);
            }
            finally
            {
                if (fileWriterFilter != null)
                {
                    Marshal.ReleaseComObject(fileWriterFilter);
                    fileWriterFilter = null;
                }
                if (nullFilter != null)
                {
                    Marshal.ReleaseComObject(nullFilter);
                    nullFilter = null;
                }
                if (capFilter != null)
                {
                    Marshal.ReleaseComObject(capFilter);
                    capFilter = null;
                }
                if (sampGrabber != null)
                {
                    Marshal.ReleaseComObject(sampGrabber);
                    sampGrabber = null;
                }
            }
        }

        /// <summary> Read and store the properties </summary>
        private void SaveSizeInfo(ISampleGrabber sampGrabber)
        {
            int hr;

            // Get the media type from the SampleGrabber
            AMMediaType media = new AMMediaType();
            hr = sampGrabber.GetConnectedMediaType( media );
            DsError.ThrowExceptionForHR( hr );

            if( (media.formatType != FormatType.VideoInfo) || (media.formatPtr == IntPtr.Zero) )
            {
                throw new NotSupportedException( "Unknown Grabber Media Format" );
            }

            // Grab the size info
            VideoInfoHeader videoInfoHeader = (VideoInfoHeader) Marshal.PtrToStructure( media.formatPtr, typeof(VideoInfoHeader) );
            m_videoWidth = videoInfoHeader.BmiHeader.Width;
            m_videoHeight = videoInfoHeader.BmiHeader.Height;
            m_stride = m_videoWidth * (videoInfoHeader.BmiHeader.BitCount / 8);

            DsUtils.FreeAMMediaType(media);
            media = null;
        }
        /// <summary> Set the options on the sample grabber </summary>
        private void ConfigureSampleGrabber(ISampleGrabber sampGrabber)
        {
            int hr;
            AMMediaType media = new AMMediaType();

            // Set the media type to Video/RBG24
            media.majorType = MediaType.Video;
            media.subType = MediaSubType.RGB24;
            media.formatType = FormatType.VideoInfo;
            hr = sampGrabber.SetMediaType( media );
            DsError.ThrowExceptionForHR( hr );

            DsUtils.FreeAMMediaType(media);
            media = null;

            // Configure the samplegrabber callback
            hr = sampGrabber.SetCallback( this, 1 );
            DsError.ThrowExceptionForHR( hr );

        }

        // Set the Framerate, and video size
        private void SetConfigParms(ICaptureGraphBuilder2 capGraph, IBaseFilter capFilter, int iFrameRate, int iWidth, int iHeight)
        {
            int hr;
            object o;
            AMMediaType media;
            IAMStreamConfig videoStreamConfig;
            IAMVideoControl videoControl = capFilter as IAMVideoControl;

            // Find the stream config interface
            hr = capGraph.FindInterface(
                PinCategory.Capture, MediaType.Video, capFilter, typeof(IAMStreamConfig).GUID, out o );

            videoStreamConfig = o as IAMStreamConfig;
            try
            {
                if (videoStreamConfig == null)
                {
                    throw new Exception("Failed to get IAMStreamConfig");
                }

                hr = videoStreamConfig.GetFormat(out media);
                DsError.ThrowExceptionForHR( hr );

                // copy out the videoinfoheader
                VideoInfoHeader v = new VideoInfoHeader();
                Marshal.PtrToStructure( media.formatPtr, v );

                // if overriding the framerate, set the frame rate
                if (iFrameRate > 0)
                {
                    v.AvgTimePerFrame = 10000000 / iFrameRate;
                }

                // if overriding the width, set the width
                if (iWidth > 0)
                {
                    v.BmiHeader.Width = iWidth;
                }

                // if overriding the Height, set the Height
                if (iHeight > 0)
                {
                    v.BmiHeader.Height = iHeight;
                }

                // Copy the media structure back
                Marshal.StructureToPtr( v, media.formatPtr, false );

                // Set the new format
                hr = videoStreamConfig.SetFormat( media );
                DsError.ThrowExceptionForHR( hr );

                DsUtils.FreeAMMediaType(media);
                media = null;

                // Fix upsidedown video
                if (videoControl != null)
                {
                    VideoControlFlags pCapsFlags;

                    IPin pPin = DsFindPin.ByCategory(capFilter, PinCategory.Capture, 0);
                    hr = videoControl.GetCaps(pPin, out pCapsFlags);
                    DsError.ThrowExceptionForHR( hr );

                    if ((pCapsFlags & VideoControlFlags.FlipVertical) > 0)
                    {
                        hr = videoControl.GetMode(pPin, out pCapsFlags);
                        DsError.ThrowExceptionForHR( hr );

                        hr = videoControl.SetMode(pPin, 0);
                    }
                }
            }
            finally
            {
                Marshal.ReleaseComObject(videoStreamConfig);
            }
        }

        /// <summary> Shut down capture </summary>
        private void CloseInterfaces()
        {
            int hr;

            try
            {
                if( m_mediaCtrl != null )
                {
                    // Stop the graph
                    hr = m_mediaCtrl.Stop();
                    m_mediaCtrl = null;
                    m_bRunning = false;
                }
            }
            catch (Exception ex)
            {
                Debug.WriteLine(ex);
            }

#if DEBUG
            if (m_rot != null)
            {
                m_rot.Dispose();
            }
#endif

            if (m_FilterGraph != null)
            {
                Marshal.ReleaseComObject(m_FilterGraph);
                m_FilterGraph = null;
            }
            GC.Collect();
        }


        /// <summary> sample callback, NOT USED. </summary>
        int ISampleGrabberCB.SampleCB( double SampleTime, IMediaSample pSample )
        {
            Marshal.ReleaseComObject(pSample);
            return 0;
        }

        /// <summary> buffer callback, COULD BE FROM FOREIGN THREAD. </summary>
        int ISampleGrabberCB.BufferCB( double SampleTime, IntPtr pBuffer, int BufferLen )
        {
            //System.Diagnostics.Debug.WriteLine(m_videoWidth);
            //System.Diagnostics.Debug.WriteLine(m_videoHeight);
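            // Copy the raw frame out of the grabber buffer and append it to the dump file.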
            Byte[] byteArray = new byte[BufferLen];
            Marshal.Copy(pBuffer, byteArray, 0, BufferLen);
            file.Write(byteArray, 0, BufferLen);

            // Diagnostic: walk the filter graph to get at the capture filter's pins.
            // The pins.Next call below is where the InvalidCastException is thrown.
            IEnumFilters filters = null;
            m_FilterGraph.EnumFilters(out filters);
            IntPtr numFilters = new IntPtr();
            IBaseFilter[] filtersFound = new IBaseFilter[1];
            filters.Skip(3);
            filters.Next(1, filtersFound, numFilters);
            FilterInfo info;
            filtersFound[0].QueryFilterInfo(out info);
            System.Diagnostics.Debug.WriteLine(info.achName);
            IEnumPins pins = null;
            filtersFound[0].EnumPins(out pins);
            IntPtr numPins = new IntPtr();
            IPin[] pinsFound = new IPin[1];
            pins.Next(1, pinsFound, numPins);

            return 0;
        }
    }
}
