I am using an AVAssetWriterInput (for video) to append sample buffers to a file. I have this code:

 if ( _assetWriter.status == AVAssetWriterStatusWriting ) {
    // If the asset writer status is writing, append sample buffer to its corresponding asset writer input
    if (mediaType == AVMediaTypeVideo) {
      if (_assetWriterVideoInput.readyForMoreMediaData) {
        if (![_assetWriterVideoInput appendSampleBuffer:sampleBuffer]) {
          NSLog(@"error: %@", [_assetWriter.error localizedFailureReason]);
          NSLog(@"error: %@", [_assetWriter.error localizedRecoveryOptions]);
          NSLog(@"error: %@", [_assetWriter.error localizedDescription]);
          NSLog(@"error: %@", [_assetWriter.error domain]);
          NSLog(@"error: %@", [_assetWriter.error userInfo]);
        } else
          NSLog(@"frame saved");
      }
    }
  }

this line

    if (![_assetWriterVideoInput appendSampleBuffer:sampleBuffer]) {

fails with an unknown error and code -12738, which, of course, cannot be found in any documentation, as usual with Apple docs.

Also, I doubt this is really an "unknown" error, for the simple reason that there are plenty of codes for unknown errors inside AVFoundation; if the system is picking code -12738, it obviously knows more than it is willing to say.

Looking at the file being saved, it stays at 0 MB, because no buffer/frame is ever written.
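
(Side note: the per-field NSLogs above do not reveal much. Dumping the writer status together with the whole NSError, including any underlying error, surfaces a bit more; this is only a minimal sketch, reusing the same _assetWriter / _assetWriterVideoInput ivars as above:)

    if (![_assetWriterVideoInput appendSampleBuffer:sampleBuffer]) {
      // Log the writer state plus the full error object in one shot;
      // NSUnderlyingErrorKey sometimes carries the raw Core Media error code.
      NSLog(@"append failed, writer status = %ld", (long)_assetWriter.status);
      NSLog(@"error = %@", _assetWriter.error);
      NSLog(@"underlying error = %@", _assetWriter.error.userInfo[NSUnderlyingErrorKey]);
    }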

This video input was created like this:

  CMVideoDimensions dimensions = CMVideoFormatDescriptionGetDimensions(currentFormatDescription);
  NSUInteger numPixels = dimensions.width * dimensions.height;
  NSUInteger bitsPerSecond;

  // 11.4 bits per pixel roughly matches the quality produced by AVCaptureSessionPresetHigh.
  CGFloat bitsPerPixel = 11.4; // must be a floating-point type; an NSUInteger would truncate 11.4 to 11

  bitsPerSecond = numPixels * bitsPerPixel;

  NSDictionary *videoCompressionSettings = @{AVVideoCodecKey                  : AVVideoCodecH264,
                                             AVVideoWidthKey                  : @(dimensions.width),
                                             AVVideoHeightKey                 : @(dimensions.height),
                                             AVVideoCompressionPropertiesKey  : @{ AVVideoAverageBitRateKey      : @(bitsPerSecond),
                                                                                   AVVideoMaxKeyFrameIntervalKey : @(30)}  };

  if ([_assetWriter canApplyOutputSettings:videoCompressionSettings forMediaType:AVMediaTypeVideo])
  {
    // Initialize the asset writer video input with the settings dictionary created above
    _assetWriterVideoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoCompressionSettings];
    _assetWriterVideoInput.expectsMediaDataInRealTime = YES;
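
As a rough sanity check on those settings (my own arithmetic, not anything from the docs): for the 1920 x 1080 buffers shown below, the average bit rate works out to about 1920 * 1080 * 11.4 ≈ 23.6 Mbps, and for the 3840 x 2160 preset to about 94.6 Mbps.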

and the buffer, when it is appended, has the following characteristics:

CMSampleBuffer 0x1009e12a0 retainCount: 1 allocator: 0x1b762cbb8
    invalid = NO
    dataReady = YES
    makeDataReadyCallback = 0x0
    makeDataReadyRefcon = 0x0
    formatDescription = <CMVideoFormatDescription 0x170443210 [0x1b762cbb8]> {
    mediaType:'vide' 
    mediaSubType:'BGRA' 
    mediaSpecific: {
        codecType: 'BGRA'       dimensions: 1920 x 1080 
    } 
    extensions: {<CFBasicHash 0x17087c2c0 [0x1b762cbb8]>{type = immutable dict, count = 2,
entries =>
    0 : <CFString 0x1b1c6d460 [0x1b762cbb8]>{contents = "Version"} = <CFNumber 0xb000000000000022 [0x1b762cbb8]>{value = +2, type = kCFNumberSInt32Type}
    2 : <CFString 0x1b1c6d3e0 [0x1b762cbb8]>{contents = "CVBytesPerRow"} = <CFNumber 0xb00000000001e002 [0x1b762cbb8]>{value = +7680, type = kCFNumberSInt32Type}
}
}
}
    sbufToTrackReadiness = 0x0
    numSamples = 1
    sampleTimingArray[1] = {
        {PTS = {290309939228910/1000000000 = 290309.939}, DTS = {INVALID}, duration = {INVALID}},
    }
    imageBuffer = 0x170321180

I have sample code here if you want to check the problem. The code is set up to shoot video in 4K; change the line AVCaptureSessionPreset3840x2160 to AVCaptureSessionPresetHigh inside ProcessadorVideo.m if your device cannot do that. The sample code crops a rectangle from the video stream and applies a comic effect to it. Thanks

1 Answer

manishg (accepted answer):

I tried your sample but couldn't reproduce the error -12738; I got another error, -12780, instead. I used an iPhone 6s, so I'm not sure if that is the reason.

In any case, I was able to fix the -12780 error. You are facing this issue because of an old timestamp. I added some logs to your app for debugging purposes; see below:

    2017-01-26 15:00:35.809590 NotWriting[16829:3116125] starting at 401051199680537
    2017-01-26 15:00:35.809986 NotWriting[16829:3116125] appending video buffer with timestamp 401051199680537
    2017-01-26 15:00:35.810008 NotWriting[16829:3116125] presentationTimeStamp is less than last frame timestamp, So probably will fail
    2017-01-26 15:00:35.815605 NotWriting[16829:3116125] error: An unknown error occurred (-12780)

When you append a buffer, you have to make sure its presentation timestamp is greater than that of the last frame you wrote; otherwise the append will fail. You can add a check with the following logic:

    CMTime presentationTimeStamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    if (CMTIME_COMPARE_INLINE(presentationTimeStamp, <=, lastpresentationTimeStamp)) {
        NSLog(@"presentationTimeStamp is less than last frame timestamp, so rejecting frame");
        return;
    }
    lastpresentationTimeStamp = presentationTimeStamp;
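
To make it concrete, here is a minimal sketch of how that check could sit around the append call. This is only illustrative: I am assuming lastpresentationTimeStamp is an ivar of your writer class, initialized to kCMTimeInvalid before the first frame, and I am reusing the _assetWriter / _assetWriterVideoInput ivars from your question.

    #import <CoreMedia/CoreMedia.h>
    #import <AVFoundation/AVFoundation.h>

    // Assumed ivars (illustrative, not taken from your project):
    //   AVAssetWriter      *_assetWriter;
    //   AVAssetWriterInput *_assetWriterVideoInput;
    //   CMTime              lastpresentationTimeStamp;   // set to kCMTimeInvalid before recording starts

    - (void)appendVideoSampleBuffer:(CMSampleBufferRef)sampleBuffer
    {
        CMTime presentationTimeStamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);

        // Drop any frame whose PTS is not strictly greater than the last appended one.
        if (CMTIME_IS_VALID(lastpresentationTimeStamp) &&
            CMTIME_COMPARE_INLINE(presentationTimeStamp, <=, lastpresentationTimeStamp)) {
            NSLog(@"presentationTimeStamp is less than last frame timestamp, so rejecting frame");
            return;
        }

        if (_assetWriter.status == AVAssetWriterStatusWriting &&
            _assetWriterVideoInput.readyForMoreMediaData) {
            if ([_assetWriterVideoInput appendSampleBuffer:sampleBuffer]) {
                // Only remember the timestamp of frames that were actually appended.
                lastpresentationTimeStamp = presentationTimeStamp;
            } else {
                NSLog(@"append failed: %@", _assetWriter.error);
            }
        }
    }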