I am having trouble interpreting the behavior of the RemoteIO audio unit callbacks in iOS. I am setting up a RemoteIO unit with two callbacks, one as an input callback and one as a "render" callback. I am following a RemoteIO setup very similar to the one recommended in this Tasty Pixel tutorial. This is the rather lengthy setup method:
- (void)setup {
    AudioUnit ioUnit;

    AudioComponentDescription audioCompDesc;
    audioCompDesc.componentType = kAudioUnitType_Output;
    audioCompDesc.componentSubType = kAudioUnitSubType_RemoteIO;
    audioCompDesc.componentManufacturer = kAudioUnitManufacturer_Apple;
    audioCompDesc.componentFlags = 0;
    audioCompDesc.componentFlagsMask = 0;

    AudioComponent rioComponent = AudioComponentFindNext(NULL, &audioCompDesc);
    CheckError(AudioComponentInstanceNew(rioComponent, &ioUnit), "Couldn't get RIO unit instance");

    // i/o
    UInt32 oneFlag = 1;
    CheckError(AudioUnitSetProperty(ioUnit,
                                    kAudioOutputUnitProperty_EnableIO,
                                    kAudioUnitScope_Output,
                                    kOutputBus,
                                    &oneFlag,
                                    sizeof(oneFlag)), "Couldn't enable RIO output");
    CheckError(AudioUnitSetProperty(ioUnit,
                                    kAudioOutputUnitProperty_EnableIO,
                                    kAudioUnitScope_Input,
                                    kInputBus,
                                    &oneFlag,
                                    sizeof(oneFlag)), "Couldn't enable RIO input");

    AudioStreamBasicDescription myASBD;
    memset(&myASBD, 0, sizeof(myASBD));
    myASBD.mSampleRate = 44100;
    myASBD.mFormatID = kAudioFormatLinearPCM;
    myASBD.mFormatFlags = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
    myASBD.mFramesPerPacket = 1;
    myASBD.mChannelsPerFrame = 1;
    myASBD.mBitsPerChannel = 16;
    myASBD.mBytesPerPacket = 2 * myASBD.mChannelsPerFrame;
    myASBD.mBytesPerFrame = 2 * myASBD.mChannelsPerFrame;

    // set stream format for both busses
    CheckError(AudioUnitSetProperty(ioUnit,
                                    kAudioUnitProperty_StreamFormat,
                                    kAudioUnitScope_Input,
                                    kOutputBus,
                                    &myASBD,
                                    sizeof(myASBD)), "Couldn't set ASBD for RIO on input scope / bus 0");
    CheckError(AudioUnitSetProperty(ioUnit,
                                    kAudioUnitProperty_StreamFormat,
                                    kAudioUnitScope_Output,
                                    kInputBus,
                                    &myASBD,
                                    sizeof(myASBD)), "Couldn't set ASBD for RIO on output scope / bus 1");

    // set arbitrarily high for now
    UInt32 bufferSizeBytes = 10000 * sizeof(int);
    int offset = offsetof(AudioBufferList, mBuffers[0]);
    int bufferListSizeInBytes = offset + (sizeof(AudioBuffer) * myASBD.mChannelsPerFrame);
    // why need to cast to audioBufferList * ?
    self.inputBuffer = (AudioBufferList *)malloc(bufferListSizeInBytes);
    self.inputBuffer->mNumberBuffers = myASBD.mChannelsPerFrame;
    for (UInt32 i = 0; i < myASBD.mChannelsPerFrame; i++) {
        self.inputBuffer->mBuffers[i].mNumberChannels = 1;
        self.inputBuffer->mBuffers[i].mDataByteSize = bufferSizeBytes;
        self.inputBuffer->mBuffers[i].mData = malloc(bufferSizeBytes);
    }

    self.remoteIOUnit = ioUnit;

    /////////////////////////////////////////////// callback setup
    AURenderCallbackStruct callbackStruct;
    callbackStruct.inputProc = inputCallback;
    callbackStruct.inputProcRefCon = (__bridge void * _Nullable)self;
    CheckError(AudioUnitSetProperty(ioUnit,
                                    kAudioOutputUnitProperty_SetInputCallback,
                                    kAudioUnitScope_Global,
                                    kInputBus,
                                    &callbackStruct,
                                    sizeof(callbackStruct)), "Couldn't set input callback");

    AURenderCallbackStruct callbackStruct2;
    callbackStruct2.inputProc = playbackCallback;
    callbackStruct2.inputProcRefCon = (__bridge void * _Nullable)self;
    CheckError(AudioUnitSetProperty(ioUnit,
                                    kAudioUnitProperty_SetRenderCallback,
                                    kAudioUnitScope_Global,
                                    kOutputBus,
                                    &callbackStruct,
                                    sizeof(callbackStruct)), "Couldn't set input callback");

    CheckError(AudioUnitInitialize(ioUnit), "Couldn't initialize input unit");
    CheckError(AudioOutputUnitStart(ioUnit), "AudioOutputUnitStart failed");
}
I am experiencing weird behavior in the callbacks. Firstly, the playbackCallback function is not called at all, despite its property being set in an identical fashion to the one from the tutorial (the tutorial is by the guy who wrote the Loopy app).
Secondly, the input callback has an ioData (AudioBufferList) parameter which should be NULL (according to the documentation), but it flips between NULL and a non-NULL value on every other callback. Does this make sense to anyone?
Additionally, calling AudioUnitRender in the input callback (the semantics of which I still don't understand in terms of API logic, lifecycle, etc.) leads to a -50 error, which is the very generic "bad params". This is most likely due to an invalid "topology" of the AudioBufferList, i.e. interleaved/deinterleaved, number of channels, etc. However, I've tried the various topologies and none of them eliminates the error. And that also doesn't explain the weird ioData behavior. Here is the function for reference:
OSStatus inputCallback(void *inRefCon,
                       AudioUnitRenderActionFlags *ioActionFlags,
                       const AudioTimeStamp *inTimeStamp,
                       UInt32 inBusNumber,
                       UInt32 inNumberFrames,
                       AudioBufferList *ioData)
{
    MicController *myRefCon = (__bridge MicController *)inRefCon;
    CheckError(AudioUnitRender(myRefCon.remoteIOUnit,
                               ioActionFlags,
                               inTimeStamp,
                               inBusNumber,
                               inNumberFrames,
                               myRefCon.inputBuffer), "audio unit render");
    return noErr;
}
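For comparison, here is one common shape for a RemoteIO input callback, sketched under the assumptions that kInputBus is defined as 1 (as in the tutorial) and that inputBuffer was allocated for the 16-bit mono ASBD above; the function name is purely illustrative. It resets each buffer's mDataByteSize before rendering and always pulls from the input element rather than whatever bus number the callback was invoked with:

static OSStatus inputCallbackSketch(void *inRefCon,
                                    AudioUnitRenderActionFlags *ioActionFlags,
                                    const AudioTimeStamp *inTimeStamp,
                                    UInt32 inBusNumber,
                                    UInt32 inNumberFrames,
                                    AudioBufferList *ioData)
{
    MicController *controller = (__bridge MicController *)inRefCon;
    AudioBufferList *bufferList = controller.inputBuffer;
    // Tell AudioUnitRender how much room this pass actually needs
    // (2 bytes per frame for 16-bit mono, per the ASBD above).
    for (UInt32 i = 0; i < bufferList->mNumberBuffers; i++) {
        bufferList->mBuffers[i].mDataByteSize = inNumberFrames * sizeof(SInt16);
    }
    // Pull the microphone samples from the input element (bus 1).
    CheckError(AudioUnitRender(controller.remoteIOUnit,
                               ioActionFlags,
                               inTimeStamp,
                               kInputBus,
                               inNumberFrames,
                               bufferList), "AudioUnitRender from input bus");
    return noErr;
}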
I believe my problems may be due to some simple errors in formatting, or possibly using the wrong bus on the wrong scope, or some other trivial mistake that is easy to make in a Core Audio context. However, because I fundamentally don't have an intuition for the semantics and lifecycle flow (scheme? I don't even know what word to use), I cannot adequately debug this. I would greatly appreciate some help from a more experienced Core Audio programmer who might shed some light on this situation.
Your kAudioUnitProperty_SetRenderCallback property setter is using callbackStruct instead of callbackStruct2. Thus your RemoteIO Audio Unit is calling inputCallback() twice instead of playbackCallback().
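As a sketch of the fix (same unit, bus, and scope as in the question's setup method), the second property call would pass callbackStruct2:

AURenderCallbackStruct callbackStruct2;
callbackStruct2.inputProc = playbackCallback;
callbackStruct2.inputProcRefCon = (__bridge void * _Nullable)self;
CheckError(AudioUnitSetProperty(ioUnit,
                                kAudioUnitProperty_SetRenderCallback,
                                kAudioUnitScope_Global,
                                kOutputBus,
                                &callbackStruct2,   // was &callbackStruct
                                sizeof(callbackStruct2)), "Couldn't set render callback");

That mix-up is also consistent with the alternating ioData you observed: the input-callback invocations arrive with a NULL ioData, while the render-callback invocations (which were also going to inputCallback) arrive with a non-NULL buffer list to fill.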