Problem reading an audio file recorded with AudioKit into an AVAudioPCMBuffer


I'm in a situation where I need to export an audio file recorded with AudioKit, and then re-import it later to do some processing on it via an AVAudioPCMBuffer.

Below is the code I'm using for exporting from AudioKit:

    // Grab the recording from the recorder and load it into the player.
    tape = recorder.audioFile!
    player.load(audioFile: tape)

    if let _ = player.audioFile?.duration {
        recorder.stop()

        // Export the recorded tape as a .caf file into the Documents directory.
        tape.exportAsynchronously(name: "TempTestFile",
                                  baseDir: .documents,
                                  exportFormat: .caf) { _, exportError in
            if let error = exportError {
                AKLog("Export Failed \(error)")
            } else {
                AKLog("Export succeeded")
            }
        }
    }

And here is where I'm trying to read back that audio file into my macOS app later, and fill an AVAudioPCMBuffer (where file is the audio file I'm trying to read):

    // `sampleRate`, `channels`, `interleaved`, `type`, `T`, and `file` come from
    // the enclosing generic function; vDSP_vspdp needs Accelerate to be imported.
    let format = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: Double(sampleRate), channels: AVAudioChannelCount(channels), interleaved: interleaved)

    if let buffer = AVAudioPCMBuffer(pcmFormat: format!, frameCapacity: AVAudioFrameCount(file.length)) {
        if (try? file.read(into: buffer)) != nil {

            let arraySize = Int(buffer.frameLength)

            switch type {
            case is Double.Type:
                // Convert the Float32 samples of channel 0 to Double.
                let doublePointer = UnsafeMutablePointer<Double>.allocate(capacity: arraySize)
                defer { doublePointer.deallocate() }
                vDSP_vspdp(buffer.floatChannelData![0], 1, doublePointer, 1, vDSP_Length(arraySize))
                return Array(UnsafeBufferPointer(start: doublePointer, count: arraySize)) as? [T]
            case is Float.Type:
                return Array(UnsafeBufferPointer(start: buffer.floatChannelData![0], count: arraySize)) as? [T]
            default: return nil
            }
        }
    }

However, I'm consistently getting the following error:

    EXCEPTION (-50): "wrong number of buffers"

    [avae] AVAEInternal.h:103:_AVAE_CheckNoErr: [AVAudioFile.mm:445:-[AVAudioFile readIntoBuffer:frameCount:error:]: (ExtAudioFileRead(_imp->_extAudioFile, &ioFrames, buffer.mutableAudioBufferList)): error -50

This happens regardless of the file format used when exporting and importing the audio.

However, it does work fine with a .wav file that is read directly from inside the app.

Does anyone have any insight into why I can't seem to read the data from the audio file into an AVAudioPCMBuffer?


1 Answer

Answered by Anton:

In this line:

    let format = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: Double(sampleRate), channels: AVAudioChannelCount(channels), interleaved: interleaved)

try setting the interleaved property to false, and don't forget to treat the resulting data as non-interleaved (please read below for more on this).

For some reason, it works with a non-interleaved buffer but I can't get it to work with the interleaved one :( If somebody knows how to do it, please leave a comment :)
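
Concretely, the corrected read path looks roughly like the sketch below. To keep it self-contained I pull the sample rate and channel count from the file's processingFormat instead of passing them in, and readPCMBuffer is just a made-up helper name; the important part is interleaved: false.

    import AVFoundation

    // Sketch only: reads an AVAudioFile into a non-interleaved Float32 buffer.
    // `readPCMBuffer` is a hypothetical helper, not part of AVFoundation or AudioKit.
    func readPCMBuffer(from file: AVAudioFile) -> AVAudioPCMBuffer? {
        guard let format = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                         sampleRate: file.processingFormat.sampleRate,
                                         channels: file.processingFormat.channelCount,
                                         interleaved: false), // the key change
              let buffer = AVAudioPCMBuffer(pcmFormat: format,
                                            frameCapacity: AVAudioFrameCount(file.length)) else {
            return nil
        }
        do {
            // With a non-interleaved buffer this no longer fails with error -50.
            try file.read(into: buffer)
            return buffer
        } catch {
            print("Read failed: \(error)")
            return nil
        }
    }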

Interleaved vs non-interleaved

This is an excerpt from Mike Ash's blog on working with audio data.

...You may have noticed that most people have two ears. Because of this, sound recorded as two separate streams sounds nicer to most people than a single stream. Conceptually, this audio can be thought of as two separate functions of pressure over time.

To represent stereo sound in data, those two functions have to be represented simultaneously. The most common way is to simply interleave the two channels, so that the first value in a buffer would be the left channel, the second value the right channel, then left again, etc. In memory, it would look like:

    LRLRLRLRLRLRLRLRLR

It's also possible to simply use two completely different buffers, which just looks like:

    buffer 1: LLLLLLLLLL
    buffer 2: RRRRRRRRRR

Deinterleaved data like this can be more convenient to work with, but the interleaved representation is more commonly used simply because it keeps everything in one place.
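
With a non-interleaved AVAudioPCMBuffer, floatChannelData hands you one pointer per channel, so a stereo buffer read with the format above can be pulled apart roughly like this (just a sketch; buffer is assumed to be the buffer returned by readPCMBuffer):

    // Sketch: extracting the two channels of a non-interleaved stereo buffer.
    let frameCount = Int(buffer.frameLength)
    if let channelData = buffer.floatChannelData, buffer.format.channelCount == 2 {
        // channelData[0] -> LLLLLLLLLL, channelData[1] -> RRRRRRRRRR
        let left  = Array(UnsafeBufferPointer(start: channelData[0], count: frameCount))
        let right = Array(UnsafeBufferPointer(start: channelData[1], count: frameCount))
        print("left: \(left.count) samples, right: \(right.count) samples")
    }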