How to stream captured image frames to an RTMP server using FFmpeg in C#


I'm working on a task where I need to capture the stream from an IP camera and then send each captured frame to an RTMP server.

For capturing the image frames from the camera I'm using Emgu CV, the OpenCV wrapper for C#.

For sending the frames to the RTMP server I'm using FFmpeg, but it isn't working. Can anybody help me with how I can achieve this task?

    private void ProcessFrame(object sender, EventArgs e)
    {
        try
        {
            if (_capture != null && _capture.Ptr != IntPtr.Zero)
            {
                // Grab the next frame from the camera capture.
                _capture.Retrieve(_frame, 0);

                if (_capture.IsOpened)
                {
                    // Save a snapshot of the frame to disk (for debugging).
                    string imagename = "image" + count + ".png";
                    _frame.Save(@"C:\Staging\Jacob\CameraImage\" + imagename);

                    // NOTE: the FFmpeg input below is declared as raw BGR24 video,
                    // so the bytes written to the pipe must be raw pixel data,
                    // not an encoded Bitmap/PNG.
                    var myBinary = ImageToByteArray(_frame.ToBitmap());

                    if (ffMpegTask != null)
                    {
                        ffMpegTask.Write(myBinary, 0, myBinary.Length);
                        if ((DateTime.UtcNow - starttime).TotalSeconds > 5)
                        {
                            // Repeat the current frame every 5 seconds.
                            ffMpegTask.Write(myBinary, 0, myBinary.Length);
                            starttime = DateTime.UtcNow;
                        }
                    }
                    else
                    {
                        // First frame: start a live FFmpeg conversion that reads raw
                        // video from the pipe and publishes FLV to the RTMP server.
                        ffMpegg = new NReco.VideoConverter.FFMpegConverter();
                        ffMpegTask = ffMpegg.ConvertLiveMedia(
                            null,                            // input is fed via Write()
                            "rawvideo",
                            "rtmp://localhost/live/abcd",
                            Format.flv,
                            new ConvertSettings()
                            {
                                CustomInputArgs = String.Format(
                                    "-y -an -f rawvideo -vcodec rawvideo -pix_fmt bgr24 -video_size {0}x{1} -framerate 5 -r 5",
                                    _frame.Width, _frame.Height),
                                CustomOutputArgs = "-c:v libx264 -pix_fmt yuv420p -preset ultrafast -f flv",
                            });
                        ffMpegTask.Start();
                    }
                }
                else
                {
                    process.Close();
                }
            }
        }
        catch (Exception ex)
        {
            MessageBox.Show(ex.Message);
        }
    }

In the above code I'm continuously receiving image frames; now I need to send each frame to the RTMP server continuously for live streaming.
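I suspect one issue is that the bytes I write need to be raw BGR24 pixels rather than an encoded Bitmap, since that is the input format declared to FFmpeg. A rough sketch of a helper for that (assuming Emgu CV's Mat.ToImage<Bgr, byte>() and the Image<,>.Bytes property; not a confirmed fix) might look like this:

    using Emgu.CV;
    using Emgu.CV.Structure;

    // Hypothetical helper: serialize an Emgu CV Mat as the raw BGR24 bytes that a
    // "-f rawvideo -pix_fmt bgr24 -video_size WxH" FFmpeg input expects.
    static byte[] MatToBgr24Bytes(Mat frame)
    {
        // Convert to a packed Image<Bgr, byte>; its Bytes property exposes the
        // underlying pixel buffer. Emgu aligns rows to 4 bytes, and 640x480 BGR
        // rows are 1920 bytes (already aligned), so no padding is added here.
        using (Image<Bgr, byte> img = frame.ToImage<Bgr, byte>())
        {
            return img.Bytes;
        }
    }

    // Usage inside ProcessFrame, instead of ImageToByteArray(_frame.ToBitmap()):
    // var myBinary = MatToBgr24Bytes(_frame);
    // ffMpegTask.Write(myBinary, 0, myBinary.Length);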

1 Answer

Winlin answered:

Do you want to get the image from the IP camera, process the frames with OpenCV, then encode them with FFmpeg and send them to the media server with FFmpeg? Your workflow is like this:

IP Camera --YUV---> OpenCV ---Frame--> FFmpeg --RTMP---> Media Server
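
A minimal sketch of this first workflow in C#, assuming ffmpeg is on PATH and the same 640x480 BGR frames at 5 fps and RTMP URL as in your code, could spawn ffmpeg directly and pipe raw frames to its stdin:

    using System.Diagnostics;

    // Sketch: run ffmpeg as a child process, feed raw BGR24 frames on stdin,
    // and let it encode H.264 and publish FLV over RTMP.
    var psi = new ProcessStartInfo
    {
        FileName = "ffmpeg",   // assumes ffmpeg is on PATH
        Arguments = "-f rawvideo -pix_fmt bgr24 -video_size 640x480 -framerate 5 -i - " +
                    "-c:v libx264 -pix_fmt yuv420p -preset ultrafast -f flv " +
                    "rtmp://localhost/live/abcd",
        UseShellExecute = false,
        RedirectStandardInput = true
    };

    using (var ffmpeg = Process.Start(psi))
    {
        var stdin = ffmpeg.StandardInput.BaseStream;

        // Inside the capture loop, write one raw frame per iteration, e.g.:
        // byte[] bgr = MatToBgr24Bytes(_frame);   // hypothetical helper from the question
        // stdin.Write(bgr, 0, bgr.Length);

        // When capture stops, close stdin so ffmpeg flushes and finishes the stream.
        stdin.Close();
        ffmpeg.WaitForExit();
    }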

There is another workflow, which might be much simpler: use FFmpeg to pull the stream from your IP camera directly, then process the RTMP stream with OpenCV, like this:

IP Camera -RTSP-> FFmpeg -> Media Server -> OpenCV/FFmpeg -> Media Server

For example:

  1. Use FFmpeg to pull RTSP from the IP camera and push RTMP to the media server.
  2. Use OpenCV/FFmpeg to pull RTMP from the media server and process it.
  3. After processing the RTMP stream, push another new RTMP stream back to the media server.

All components in the pipeline use the common RTSP/RTMP protocols; a minimal sketch of steps 1 and 2 follows.
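
Assuming ffmpeg is on PATH, a placeholder camera URL rtsp://camera/stream, and a media server at rtmp://localhost/live (adjust all of these for your setup), steps 1 and 2 might look like this in C#:

    using System.Diagnostics;
    using Emgu.CV;

    // Step 1: let ffmpeg pull RTSP from the camera and push RTMP to the media server.
    // "-c copy" relays the camera stream without re-encoding.
    var relay = Process.Start(new ProcessStartInfo
    {
        FileName = "ffmpeg",
        Arguments = "-rtsp_transport tcp -i rtsp://camera/stream " +
                    "-c copy -f flv rtmp://localhost/live/camera",
        UseShellExecute = false
    });

    // Step 2: open the RTMP stream with OpenCV (Emgu CV) and process frames.
    using (var capture = new VideoCapture("rtmp://localhost/live/camera"))
    using (var frame = new Mat())
    {
        while (capture.Read(frame) && !frame.IsEmpty)
        {
            // ... process the frame, then write it to a second ffmpeg process
            // publishing a new stream such as rtmp://localhost/live/processed
            // (step 3), e.g. via the rawvideo stdin pipe shown earlier.
        }
    }

    relay.WaitForExit();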