How to convert AudioBufferList containing AAC data to CMSampleBuffer


I'm using AudioConverter to convert uncompressed CMSampleBuffers captured via AVCaptureSession into an AudioBufferList:

let status: OSStatus = AudioConverterFillComplexBuffer(
            converter,
            inputDataProc,
            Unmanaged.passUnretained(self).toOpaque(),
            &ioOutputDataPacketSize,
            outOutputData.unsafeMutablePointer,
            nil
        )
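For context, a sketch of how the output arguments for that call can be prepared (the sizes and names here are assumptions, not the exact code from the question; in practice `maxPacketSize` would be queried via `kAudioConverterPropertyMaximumOutputPacketSize`):

```swift
import AudioToolbox

// Assumed setup for the call above: request one AAC packet per call and
// give the converter a single output buffer to fill.
var ioOutputDataPacketSize: UInt32 = 1   // in: max packets wanted; out: packets produced
let maxPacketSize = 1024                 // placeholder; query the converter for the real value
let outOutputData = AudioBufferList.allocate(maximumBuffers: 1)
outOutputData[0].mNumberChannels = 1
outOutputData[0].mDataByteSize = UInt32(maxPacketSize)
outOutputData[0].mData = UnsafeMutableRawPointer.allocate(
    byteCount: maxPacketSize,
    alignment: MemoryLayout<UInt8>.alignment
)
```

After the call returns, `ioOutputDataPacketSize` holds the number of AAC packets actually produced.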

My output ASBD is set up as follows:

AudioStreamBasicDescription
- mSampleRate : 44100.0
- mFormatID : 1633772320
- mFormatFlags : 2
- mBytesPerPacket : 0
- mFramesPerPacket : 1024
- mBytesPerFrame : 0
- mChannelsPerFrame : 1
- mBitsPerChannel : 0
- mReserved : 0
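For reference, the same description expressed in code (a sketch): `1633772320` is the FourCC `'aac '`, i.e. `kAudioFormatMPEG4AAC`, and `mFormatFlags` 2 corresponds to the AAC-LC object type.

```swift
import AudioToolbox

// The dump above as code. For a compressed format like AAC the
// per-packet/per-frame byte counts are 0 (packet sizes vary), while
// mFramesPerPacket is fixed: one AAC packet always decodes to 1024 frames.
var outputASBD = AudioStreamBasicDescription(
    mSampleRate: 44100.0,
    mFormatID: kAudioFormatMPEG4AAC,                      // 1633772320 == 'aac '
    mFormatFlags: UInt32(MPEG4ObjectID.AAC_LC.rawValue),  // 2
    mBytesPerPacket: 0,
    mFramesPerPacket: 1024,
    mBytesPerFrame: 0,
    mChannelsPerFrame: 1,
    mBitsPerChannel: 0,
    mReserved: 0
)
```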

I'd like to convert the AudioBufferList back into a CMSampleBuffer containing the compressed data so that I can write it to an mp4 file using AVAssetWriter (I have already figured out how to do this for video), but so far with little success. I've tried consulting this answer, but in that case the data is PCM, so it doesn't seem applicable here.

I have access to the AudioBufferList as well as the presentationTimeStamp of the original sample. I've tried the following, but I'm not sure how to calculate numSamples, or whether this approach makes sense at all:

 func createCMSampleBuffer(_ data: UnsafeMutableAudioBufferListPointer, presentationTimeStamp: CMTime) -> CMSampleBuffer? {
    let numSamples = // not sure how to get this

    var status: OSStatus = noErr
    var sampleBuffer: CMSampleBuffer?
    var timing: CMSampleTimingInfo = CMSampleTimingInfo(
        duration: CMTime(value: CMTimeValue(numSamples), timescale: presentationTimeStamp.timescale),
        presentationTimeStamp: presentationTimeStamp,
        decodeTimeStamp: CMTime.invalid
    )

    status = CMSampleBufferCreate(
        allocator: kCFAllocatorDefault,
        dataBuffer: nil,
        dataReady: false,
        makeDataReadyCallback: nil,
        refcon: nil,
        formatDescription: formatDescription,
        sampleCount: CMItemCount(numSamples),
        sampleTimingEntryCount: 1,
        sampleTimingArray: &timing,
        sampleSizeEntryCount: 0,
        sampleSizeArray: nil,
        sampleBufferOut: &sampleBuffer
    )

    guard status == noErr else {
        return nil
    }

    status = CMSampleBufferSetDataBufferFromAudioBufferList(
        sampleBuffer!,
        blockBufferAllocator: kCFAllocatorDefault,
        blockBufferMemoryAllocator: kCFAllocatorDefault,
        flags: 0,
        bufferList: data.unsafePointer
    )

    guard status == noErr else {
        return nil
    }

    return sampleBuffer
}
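On the `numSamples` question in the snippet above, one possibility (a sketch under the assumption that the converter's output packet count is available): each AAC packet always decodes to `mFramesPerPacket` (1024) frames, so the frame count follows from the packet count.

```swift
// Sketch: `packetCount` is assumed to be the value of ioOutputDataPacketSize
// after AudioConverterFillComplexBuffer returns (the number of AAC packets
// actually produced). Each AAC packet decodes to exactly 1024 PCM frames.
let framesPerPacket = 1024   // outputASBD.mFramesPerPacket for AAC
let packetCount = 1          // placeholder: one packet per convert call
let numSamples = packetCount * framesPerPacket
// numSamples == 1024 for a single packet
```

The buffer's duration then follows directly as `CMTime(value: CMTimeValue(numSamples), timescale: 44100)`, roughly 23.2 ms per packet.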

In the end I did manage to create a CMSampleBuffer, but when I try to finish writing, I get the following error:

Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSUnderlyingError=0x174442ac0 {Error Domain=NSOSStatusErrorDomain Code=-12735 "(null)"}, NSLocalizedFailureReason=An unknown error occurred (-12735), NSLocalizedDescription=The operation could not be completed}

There are 2 answers

Shivam Parmar

I can share some research on this; maybe it will help you:

CMSampleBufferSetDataBufferFromAudioBufferList returned error: -12731

Solution: https://lists.apple.com/archives/coreaudio-api/2014/Mar/msg00008.html

See also: Converting AudioBuffer to CMSampleBuffer with accurate CMTime

Grzegorz Aperliński

So, I've made some progress (though still far from having everything work). Instead of constructing the CMSampleBuffer as above, I got the following to (somewhat) work:

CMAudioSampleBufferCreateWithPacketDescriptions(
        allocator: kCFAllocatorDefault,
        dataBuffer: nil,
        dataReady: false,
        makeDataReadyCallback: nil,
        refcon: nil,
        formatDescription: formatDescription!,
        sampleCount: Int(data.unsafePointer.pointee.mNumberBuffers),
        presentationTimeStamp: presentationTimeStamp,
        packetDescriptions: &packetDescriptions,
        sampleBufferOut: &sampleBuffer)

The key here is to obtain the packetDescriptions from the compression process:

let packetDescriptionsPtr = UnsafeMutablePointer<AudioStreamPacketDescription>.allocate(capacity: 1)

AudioConverterFillComplexBuffer(
                converter,
                inputDataProc,
                Unmanaged.passUnretained(self).toOpaque(),
                &ioOutputDataPacketSize,
                outOutputData.unsafeMutablePointer,
                packetDescriptionsPtr
            )
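One piece both snippets depend on is `formatDescription`. A sketch of creating it from the AAC output ASBD follows; note that for AAC, AVAssetWriter may also need the encoder's magic cookie (readable via `kAudioConverterCompressionMagicCookie`) attached here, which this sketch omits for brevity.

```swift
import AudioToolbox
import CoreMedia

// Sketch: build a CMAudioFormatDescription from the compressed output ASBD.
// Passing the converter's magic cookie via magicCookie/magicCookieSize may
// be required for AVAssetWriter to accept AAC buffers; omitted here.
var outputASBD = AudioStreamBasicDescription(
    mSampleRate: 44100, mFormatID: kAudioFormatMPEG4AAC, mFormatFlags: 2,
    mBytesPerPacket: 0, mFramesPerPacket: 1024, mBytesPerFrame: 0,
    mChannelsPerFrame: 1, mBitsPerChannel: 0, mReserved: 0)
var formatDescription: CMAudioFormatDescription?
let fdStatus = CMAudioFormatDescriptionCreate(
    allocator: kCFAllocatorDefault,
    asbd: &outputASBD,
    layoutSize: 0,
    layout: nil,
    magicCookieSize: 0,
    magicCookie: nil,
    extensions: nil,
    formatDescriptionOut: &formatDescription
)
```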

The audio CMSampleBuffer seems to be correctly created now, but when I append it, the audio doesn't play and it causes weird timing glitches in the video.
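On the timing glitches, one thing worth checking (an assumption, not a confirmed fix): successive audio buffers should advance by exactly one packet duration, 1024 frames at 44100 Hz, rather than reusing the capture timestamps of the PCM input. In plain arithmetic:

```swift
// Sketch: generate monotonically increasing PTS values in units of
// 1/44100 s (i.e. CMTime.value at timescale 44100). Each AAC packet
// covers exactly 1024 frames, so consecutive buffers are 1024 ticks
// (~23.2 ms) apart.
let framesPerPacket: Int64 = 1024
var ptsValue: Int64 = 0

func nextPTSValue() -> Int64 {
    let v = ptsValue
    ptsValue += framesPerPacket
    return v
}
// nextPTSValue() -> 0, then 1024, then 2048, ...
```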