Instead you need the third type, which I call a Processor. It’s still an RCB (attached this time via AudioUnitAddRenderNotify()), and many people use them in lieu of an input bus RCB to *generate* audio – that’s fine. Their code needs to be slightly different because, unlike input and output bus RCBs, the audio unit calls it *twice* for *each bus* on every render cycle – once with the kAudioUnitRenderAction_PreRender flag set and once with the ...PostRender flag. Your RCB code therefore needs to begin with a conditional to prevent it from running when it shouldn’t. For example:
// Only do pre-render on bus 1
if ((*ioActionFlags & kAudioUnitRenderAction_PostRender) || inBusNumber != 1) {
    return noErr;
}
So for your scenario, you want one of these “Processor” render callbacks on your RemoteIO. You then copy the data from ioData into an external ring buffer (like Michael Tyson’s rightfully famous TPCircularBuffer) pointed to by your inRefCon. You can then pull from this ring buffer in your “Generator” render callback attached to your mixer input.
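Something along these lines, perhaps (a rough, untested sketch: it assumes mono Float32 samples throughout, and that both callbacks receive the same TPCircularBuffer – initialised elsewhere with TPCircularBufferInit() – via their inRefCon):

#include <AudioUnit/AudioUnit.h>
#include <string.h>
#include "TPCircularBuffer.h"

// "Processor": render notify on the RemoteIO. Captures the mic data into the ring buffer.
static OSStatus MicCaptureNotify(void *inRefCon,
                                 AudioUnitRenderActionFlags *ioActionFlags,
                                 const AudioTimeStamp *inTimeStamp,
                                 UInt32 inBusNumber,
                                 UInt32 inNumberFrames,
                                 AudioBufferList *ioData)
{
    // Only do pre-render on bus 1
    if ((*ioActionFlags & kAudioUnitRenderAction_PostRender) || inBusNumber != 1) {
        return noErr;
    }
    TPCircularBuffer *ring = (TPCircularBuffer *)inRefCon;
    TPCircularBufferProduceBytes(ring,
                                 ioData->mBuffers[0].mData,
                                 ioData->mBuffers[0].mDataByteSize);
    return noErr;
}

// "Generator": input RCB on mixer bus 0. Pulls the mic data back out of the ring buffer.
static OSStatus MicPlaybackCallback(void *inRefCon,
                                    AudioUnitRenderActionFlags *ioActionFlags,
                                    const AudioTimeStamp *inTimeStamp,
                                    UInt32 inBusNumber,
                                    UInt32 inNumberFrames,
                                    AudioBufferList *ioData)
{
    TPCircularBuffer *ring = (TPCircularBuffer *)inRefCon;
    int32_t availableBytes = 0;
    void *micData = TPCircularBufferTail(ring, &availableBytes);
    UInt32 bytesWanted = ioData->mBuffers[0].mDataByteSize;
    UInt32 bytesToCopy = ((UInt32)availableBytes < bytesWanted) ? (UInt32)availableBytes : bytesWanted;

    memset(ioData->mBuffers[0].mData, 0, bytesWanted);    // silence first, in case we're short
    if (micData && bytesToCopy > 0) {
        memcpy(ioData->mBuffers[0].mData, micData, bytesToCopy);
        TPCircularBufferConsume(ring, bytesToCopy);
    }
    return noErr;
}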
Again though, I’ve not done much work with the mic. This setup is a bit circular in its effort to deal with the single-RemoteIO limitation while still using the AUMultiChannelMixer (will the Mixer’s input RCB be called *after* the RemoteIO’s PreRender phase even though the Mixer occurs earlier in the graph? anyone?), so if it doesn’t work as expected, I’m sorry! Hopefully someone else with experience rocking the mic will chime in...
Accelerate is a set of functions for leveraging the ARM processor’s ultra-fast floating-point vector operations, “vector” being synonymous with “buffer” in this context. The vDSP functions are the ones to look at: https://developer.apple.com/library/mac/#documentation/Accelerate/Reference/vDSPRef/Reference/reference.html
For example to add (mix) two (Float32) buffers:
vDSP_vadd(source1Ptr, 1, source2Ptr, 1, destBufferPtr, 1, sizeInFrames);
The “1”s are the “stride” – the difference in array indices between two consecutive samples to be processed. If your buffers were interleaved stereo you’d use 2.
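For instance, a hypothetical variation on the call above, assuming source1Ptr, source2Ptr and destBufferPtr now point at interleaved stereo buffers and you only want to sum the left channels:

#include <Accelerate/Accelerate.h>

// Interleaved stereo is laid out L R L R ..., so the left samples sit at
// indices 0, 2, 4, ... A stride of 2 makes vDSP touch only those samples;
// the right-channel slots of destBufferPtr are left alone.
vDSP_vadd(source1Ptr, 2,
          source2Ptr, 2,
          destBufferPtr, 2,
          sizeInFrames);    // one left-channel sample per frame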
Hope that gets you started...
Gratefully,
Hari Karam Singh
══════════════════════════════════════
"Finally, an app that breaks you out of the 2 inch screen!"
Sound Wand ~ A real musical instrument that fits in your pocket!
http://soundwandapp.com | Tw | Fb | Yt | Sc
══════════════════════════════════════
Club 15CC
An Inspirational Media and Technology Studio
London, UK
+44 (0) 207 394 8587
+44 (0) 779 055 6418
http://club15cc.com/
Part of Amritvela
Hi Hari,
Thanks for confirming my suspicions. I was just reading your blog this morning regarding mixing - not sure I understand it all though!
I'm not sure what this means: "2. Render Notify attached to RemoteIO which processes the prerender phase on bus 1 (i.e., if (bus != 1 || !(ioFlags & k...PreRender)) return noErr; ) and which stores the mic data in a ring buffer"
Could you explain a bit more? :D
I was tempted to go the manual mixing route when I read about the Multichannel Mixer AudioUnit. This is the first time I've heard of the Accelerate framework - what advantages does it offer with regards to this?
On Tue, Nov 6, 2012 at 6:23 PM, Hari Karam Singh <email@hidden> wrote:
Hi Pier,
I’ve just been through something similar. I believe the answer is yes: you may only have one RemoteIO unit. Also, you definitely need a RemoteIO as the last unit to send the audio to the hardware. Reference: http://lists.apple.com/archives/coreaudio-api/2009/Apr/msg00042.html
I’ve not much experience with using the mic, but I would think you could accomplish what you need with the following setup (rough code sketch after the list):
1. Mixer output bus 0 -> Remote IO Input bus 0
2. Render Notify attached to RemoteIO which processes the prerender phase on bus 1 (i.e., if (bus != 1 || !(ioFlags & k...PreRender)) return noErr; ) and which stores the mic data in a ring buffer
3. Attach a custom render callback to Mixer input bus 0 which reads from the ring buffer
4. Attach your other source to mixer input bus 1
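As a very rough, untested sketch of the wiring (error checking omitted; MicCaptureNotify and MicPlaybackCallback are hypothetical names for the callbacks in steps 2 and 3, and the 32 KB ring buffer size is arbitrary):

#include <AudioToolbox/AudioToolbox.h>
#include "TPCircularBuffer.h"

static TPCircularBuffer sMicRingBuffer;    // shared by both callbacks via their inRefCon

// AURenderCallbacks per steps 2 and 3, defined elsewhere
static OSStatus MicCaptureNotify(void *inRefCon, AudioUnitRenderActionFlags *ioActionFlags,
                                 const AudioTimeStamp *inTimeStamp, UInt32 inBusNumber,
                                 UInt32 inNumberFrames, AudioBufferList *ioData);
static OSStatus MicPlaybackCallback(void *inRefCon, AudioUnitRenderActionFlags *ioActionFlags,
                                    const AudioTimeStamp *inTimeStamp, UInt32 inBusNumber,
                                    UInt32 inNumberFrames, AudioBufferList *ioData);

static void SetupGraph(void)
{
    TPCircularBufferInit(&sMicRingBuffer, 32 * 1024);

    AudioComponentDescription ioDesc = {
        .componentType = kAudioUnitType_Output,
        .componentSubType = kAudioUnitSubType_RemoteIO,
        .componentManufacturer = kAudioUnitManufacturer_Apple
    };
    AudioComponentDescription mixerDesc = {
        .componentType = kAudioUnitType_Mixer,
        .componentSubType = kAudioUnitSubType_MultiChannelMixer,
        .componentManufacturer = kAudioUnitManufacturer_Apple
    };

    AUGraph graph;
    AUNode ioNode, mixerNode;
    NewAUGraph(&graph);
    AUGraphAddNode(graph, &ioDesc, &ioNode);
    AUGraphAddNode(graph, &mixerDesc, &mixerNode);
    AUGraphOpen(graph);

    AudioUnit ioUnit, mixerUnit;
    AUGraphNodeInfo(graph, ioNode, NULL, &ioUnit);
    AUGraphNodeInfo(graph, mixerNode, NULL, &mixerUnit);

    // Enable the mic on the RemoteIO's input element (bus 1) and give the mixer 2 input buses
    UInt32 one = 1, busCount = 2;
    AudioUnitSetProperty(ioUnit, kAudioOutputUnitProperty_EnableIO,
                         kAudioUnitScope_Input, 1, &one, sizeof(one));
    AudioUnitSetProperty(mixerUnit, kAudioUnitProperty_ElementCount,
                         kAudioUnitScope_Input, 0, &busCount, sizeof(busCount));

    // 1. Mixer output bus 0 -> RemoteIO input bus 0
    AUGraphConnectNodeInput(graph, mixerNode, 0, ioNode, 0);

    // 2. Render notify on the RemoteIO stores the mic data in the ring buffer
    AudioUnitAddRenderNotify(ioUnit, MicCaptureNotify, &sMicRingBuffer);

    // 3. Custom render callback on mixer input bus 0 reads from the ring buffer
    AURenderCallbackStruct micCallback = { MicPlaybackCallback, &sMicRingBuffer };
    AUGraphSetNodeInputCallback(graph, mixerNode, 0, &micCallback);

    // 4. Your other source feeds mixer input bus 1
    //    (another AUGraphConnectNodeInput() or AUGraphSetNodeInputCallback() here)

    AUGraphInitialize(graph);
    AUGraphStart(graph);
}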
You could also skip the mixer unit entirely and do the mixing manually inside the RemoteIO’s render notify callback using the lightning-fast vDSP functions in the Accelerate framework.
Gratefully,
Hari Karam Singh
http://soundwandapp.com/
http://club15cc.com/
**RemoteIO1 (for recording to buffer) -> Mixer -> RemoteIO2 (for playback of output)**
RemoteIO1 is used for 2 purposes
1) To feed audio into the mixer channel 0
2) To record audio from mic to a buffer
Mixer
1) Takes audio from RemoteIO1 - input 0
2) Mixes the audio from (1) with audio from the buffer - input 1

RemoteIO2
1) Takes the mixed audio and sends it to playback
Initially I thought that I could just play back from the mixer output, but the following gives me an error. Can I confirm that I need another RemoteIO to do playback?
// Enable Mixer for playback
UInt32 flag = 1;
status = AudioUnitSetProperty(_mixerUnit, kAudioOutputUnitProperty_EnableIO,
                              kAudioUnitScope_Output, 0, &flag, sizeof(flag));
if (noErr != status) { NSLog(@"Enable Mixer for playback error"); return; }
Also, I did the following test and realised there seems to be only one RemoteIO available (addresses for inputComponent and inputComponent2 are the same)
AudioComponentDescription desc = { kAudioUnitType_Output, kAudioUnitSubType_RemoteIO, kAudioUnitManufacturer_Apple, 0, 0 };
AudioComponent inputComponent = AudioComponentFindNext(NULL, &desc);
AudioComponent inputComponent2 = AudioComponentFindNext(NULL, &desc);
Is it true that I can only have one instance of RemoteIO in my app? If so, what are the alternatives for the 2nd RemoteIO?