iOS screen capture: RPBroadcastSampleHandler calls back with some bad video frames (YUV NV12) when the screen FPS is high?


I'm working on screen capture and recording on iOS, based on ReplayKit and RPBroadcastSampleHandler. The video and audio frames are delivered through the processSampleBuffer:withType: callback. The result is fine while the screen is still, but when it moves, some bad frames come in: it looks as if the frame has been split.

I simply dump the frame and return; the dump happens asynchronously to avoid blocking the callback. The code is very simple:



extension CVPixelBuffer {
    /// Copies every plane synchronously, then writes the copies on a background
    /// queue so the ReplayKit callback is not blocked by file I/O.
    /// The caller must hold a read-only base-address lock on the buffer.
    func asyncDump(_ f: UnsafeMutablePointer<FILE>) {
        NSLog("dump a frame.")
        // Copy the planes now, while the buffer is still locked and valid.
        var copies: [(data: UnsafeMutableRawPointer, size: Int)] = []
        for plane in 0 ..< CVPixelBufferGetPlaneCount(self) {
            guard let source = CVPixelBufferGetBaseAddressOfPlane(self, plane) else { continue }
            let height      = CVPixelBufferGetHeightOfPlane(self, plane)
            let bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(self, plane)
            let planeSize   = height * bytesPerRow   // note: includes per-row padding
            guard let rawFrame = malloc(planeSize) else { continue }
            memcpy(rawFrame, source, planeSize)
            copies.append((data: rawFrame, size: planeSize))
        }
        // Write all planes from a single block so their order in the file is preserved.
        DispatchQueue.global(qos: .default).async {
            for copy in copies {
                fwrite(copy.data, 1, copy.size, f)
                free(copy.data)
            }
            NSLog("dump a frame done.")
        }
    }
}
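
For reference, the bytesPerRow of each plane is usually larger than the visible width because CoreVideo pads rows for alignment, so a raw dump of height * bytesPerRow bytes only displays correctly in a YUV viewer that is told the padded stride. Below is a minimal sketch of a variant that strips that padding; it assumes an NV12 buffer and uses a hypothetical dumpWithoutRowPadding name (not part of the code above), and the caller still has to hold the read-only lock:

import CoreVideo
import Foundation

extension CVPixelBuffer {
    /// Hypothetical helper: writes each plane without its per-row alignment
    /// padding, assuming an NV12 buffer
    /// (kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange / FullRange).
    /// The caller must hold a read-only base-address lock.
    func dumpWithoutRowPadding(_ f: UnsafeMutablePointer<FILE>) {
        // For NV12, plane 0 is Y (1 byte per pixel) and plane 1 is interleaved
        // CbCr (2 bytes per pixel at half width), so both planes carry exactly
        // `lumaWidth` meaningful bytes per row.
        let usefulBytesPerRow = CVPixelBufferGetWidth(self)
        for plane in 0 ..< CVPixelBufferGetPlaneCount(self) {
            guard let base = CVPixelBufferGetBaseAddressOfPlane(self, plane) else { continue }
            let height      = CVPixelBufferGetHeightOfPlane(self, plane)
            let bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(self, plane)
            for row in 0 ..< height {
                fwrite(base + row * bytesPerRow, 1, min(usefulBytesPerRow, bytesPerRow), f)
            }
        }
    }
}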

    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType) {
        if DUMP_YUV && sampleBufferType != .video {
            return // while dumping YUV, ignore non-video buffers entirely
        }
        switch sampleBufferType {
        case RPSampleBufferType.video:
            // Handle video sample buffer
            videoFrames += 1
            if let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
                let planeCount = CVPixelBufferGetPlaneCount(imageBuffer)
                let height = UInt32(CVPixelBufferGetHeight(imageBuffer))
                let width  = UInt32(CVPixelBufferGetWidth(imageBuffer))
                printFPS("video-\(width)x\(height)")
                CVPixelBufferLockBaseAddress(imageBuffer, [.readOnly])
                defer { CVPixelBufferUnlockBaseAddress(imageBuffer, [.readOnly]) }

                if DUMP_YUV {
                    if videoFrames % 150 == 0 { // dump 1 frame in 150 to avoid too much I/O
                        if let f = f {
                            imageBuffer.asyncDump(f)
                        }
                    }
                    return
                }
...
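
The printFPS(_:) helper used above is not shown in the question. Purely for context, a minimal stand-in that counts callbacks per label and logs the observed rate roughly once per second could look like this (hypothetical; the real helper may differ):

import Foundation

// Hypothetical stand-in for printFPS(_:): per-label callback counter that
// logs the observed rate about once per second.
private var fpsWindows: [String: (count: Int, since: CFAbsoluteTime)] = [:]

func printFPS(_ label: String) {
    let now = CFAbsoluteTimeGetCurrent()
    var window = fpsWindows[label] ?? (count: 0, since: now)
    window.count += 1
    let elapsed = now - window.since
    if elapsed >= 1.0 {
        NSLog("\(label): \(String(format: "%.1f", Double(window.count) / elapsed)) fps")
        window = (count: 0, since: now)
    }
    fpsWindows[label] = window
}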


One of the bad frames looked like this: the image should be all white, but its left area was a bit darker, as if the image had been split.

I also found that when the bad frames come in, the callback frame rate spikes to as high as 90 fps, and there seems to be no method or parameter to configure this frame rate. It's very confusing, and there is very little information about it online.
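
As far as I know, ReplayKit exposes no public API to set the broadcast capture frame rate, so one workaround is to drop frames inside the handler based on their presentation timestamps. A minimal sketch, assuming a hypothetical SampleHandler class and a 30 fps cap (audio handling omitted):

import ReplayKit
import CoreMedia

class SampleHandler: RPBroadcastSampleHandler {   // hypothetical class name
    private var lastProcessedPTS = CMTime.invalid
    private let minFrameInterval = CMTime(value: 1, timescale: 30)  // cap processing at ~30 fps

    /// Returns true if enough time has passed since the last frame we kept.
    private func shouldProcess(_ sampleBuffer: CMSampleBuffer) -> Bool {
        let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        if lastProcessedPTS.isValid, CMTimeSubtract(pts, lastProcessedPTS) < minFrameInterval {
            return false                   // too soon since the last kept frame: drop it
        }
        lastProcessedPTS = pts             // only advance when a frame is kept
        return true
    }

    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType) {
        guard sampleBufferType == .video else { return }   // audio omitted in this sketch
        guard shouldProcess(sampleBuffer) else { return }
        // ... encode / dump the frame as before ...
    }
}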
