Hi Pier,
This is something from a blog post I’m working on. Maybe it will help clarify...
It’s a common misconception to think of RemoteIO as a “generator” or “processor”. As far as I can tell, in iOS it’s really just an input (from the mic) and an output (to the hardware). “Generators” – as in things that feed in audio – are accomplished by attaching a Render Callback, which you can do to any Audio Unit. Because of iOS’s limited support for AUs (it’s growing), it’s a common design pattern to have only a single unit (a RemoteIO) with a single Render Callback which handles all the audio, often attached to the RemoteIO’s input bus 0. It’s easy to think of the pair as effectively being a single “custom” audio unit, but it’s not...
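In real code the attachment is one AudioUnitSetProperty call with kAudioUnitProperty_SetRenderCallback and an AURenderCallbackStruct (a function pointer plus a context pointer). Here’s a rough, portable-C sketch of just that mechanic – the unit pulls audio by invoking whatever callback you registered. None of these names (CallbackStruct, render_cycle, sawtoothGenerator) are real Core Audio types, they’re just mine:

```c
#include <stddef.h>

/* Sketch of the Render Callback pattern: mirrors the shape of Core Audio's
   AURenderCallbackStruct (function pointer + context pointer), but these are
   NOT real Core Audio types -- just an illustration of the mechanics. */
typedef int (*RenderCallback)(void *refCon, float *buffer, size_t frames);

typedef struct {
    RenderCallback inputProc;   /* like AURenderCallbackStruct.inputProc */
    void *inputProcRefCon;      /* like .inputProcRefCon */
} CallbackStruct;

/* A "Generator" callback: synthesizes audio into the buffer it is handed.
   Its state (the phase) lives in the refCon. */
static int sawtoothGenerator(void *refCon, float *buffer, size_t frames) {
    float *phase = (float *)refCon;
    for (size_t i = 0; i < frames; i++) {
        buffer[i] = *phase;
        *phase += 0.01f;
        if (*phase > 1.0f) *phase -= 2.0f;    /* wrap back into [-1, 1) */
    }
    return 0;                                  /* i.e. noErr */
}

/* The "audio unit" side: on each render cycle it pulls audio by invoking
   the registered callback, just as RemoteIO pulls on its input bus. */
static int render_cycle(CallbackStruct *cb, float *buffer, size_t frames) {
    return cb->inputProc(cb->inputProcRefCon, buffer, frames);
}
```

The real registration is the same idea: fill the struct, then set it on the RemoteIO’s input scope, bus 0.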
Also, there are really 3 types of Render Callbacks (RCBs). The first, which I like to call “Generators”, attach to an input bus. They read audio from a ring buffer or synthesize it and feed it into the input buffer. “Capturers” have the same structure but attach to the output bus (or the input bus for mic capture??) and grab the audio to do something with it. You might think an app which records you playing would do so via a Capturer, but this too is a mistake: if you attach an RCB to RemoteIO’s output bus 0, the audio no longer gets fed to the output hardware, it seems.
Instead you need the third type, which I call a “Processor”. It’s still an RCB (attached this time via AudioUnitAddRenderNotify()) and many people use them in lieu of an input bus RCB to *generate* audio – that’s fine. Their code needs to be slightly different because, unlike input and output bus RCBs, the audio unit calls it *twice* for *each bus* on every round – once with the kAudioUnitRenderAction_PreRender flag set and once with the ...PostRender flag. Your RCB code needs to begin with a conditional to prevent it from running when it shouldn’t. For example:
// Only do the pre-render phase on bus 1
if ((*ioActionFlags & kAudioUnitRenderAction_PostRender) || inBusNumber != 1) {
    return noErr;
}
So for your scenario, you want one of these “Processor” Render Callbacks on your RemoteIO. You then copy the data in ioData into an external ring buffer (like Michael Tyson’s rightfully famous TPCircularBuffer) pointed to by your inRefCon. You can then pull from this ring buffer in the “Generator” render callback attached to your mixer input.
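For production use TPCircularBuffer; but just to show the handoff shape (Processor writes, Generator reads), here’s a minimal hand-rolled single-producer/single-consumer sketch. All the names (Ring, ring_write, ring_read) are mine, not from any library:

```c
#include <stddef.h>
#include <string.h>

/* Minimal ring buffer sketch (single producer, single consumer only).
   In a real app use Michael Tyson's TPCircularBuffer instead. */
#define RING_CAPACITY 1024

typedef struct {
    float data[RING_CAPACITY];
    size_t head;   /* next write index ("Processor" side)  */
    size_t tail;   /* next read index  ("Generator" side)  */
} Ring;

/* Called from the "Processor" render-notify: stash the mic frames. */
static size_t ring_write(Ring *r, const float *src, size_t n) {
    size_t written = 0;
    while (written < n && (r->head + 1) % RING_CAPACITY != r->tail) {
        r->data[r->head] = src[written++];
        r->head = (r->head + 1) % RING_CAPACITY;
    }
    return written;   /* may be < n if the buffer is full */
}

/* Called from the "Generator" RCB on the mixer input: pull frames,
   zero-filling when the producer hasn't kept up (better than garbage). */
static size_t ring_read(Ring *r, float *dst, size_t n) {
    size_t read = 0;
    while (read < n && r->tail != r->head) {
        dst[read++] = r->data[r->tail];
        r->tail = (r->tail + 1) % RING_CAPACITY;
    }
    memset(dst + read, 0, (n - read) * sizeof(float));
    return read;
}
```

The Ring pointer is what you’d stuff into inRefCon for both callbacks.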
Again though, I’ve not done much work with the mic. In an effort to deal with the single RemoteIO limitation but still use the AUMultichannelMixer, this setup is a bit circular (will the Mixer’s input RCB be called *after* the RemoteIO’s Pre_Render phase even though the Mixer occurs first in the graph? anyone?), so if it doesn’t work as expected, I’m sorry! Hopefully someone else with experience rocking the mic will chime in...
Accelerate is a set of functions for leveraging the ARM processor’s ultra-fast floating-point vector operations, “vector” here being roughly synonymous with “buffer”. The vDSP functions are the ones to look at: https://developer.apple.com/library/mac/#documentation/Accelerate/Reference/vDSPRef/Reference/reference.html
For example to add (mix) two (Float32) buffers:
vDSP_vadd(source1Ptr, 1, source2Ptr, 1, destBufferPtr, 1, sizeInFrames);
The “1”s are the “stride” – the difference in array indices between two consecutive samples of the same channel. If your buffers were interleaved stereo you’d use 2.
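To make the stride concrete, here’s a plain-C model of what vDSP_vadd computes (minus the SIMD speed). The name vadd_model is mine, not Apple’s:

```c
#include <stddef.h>

/* Plain-C model of vDSP_vadd: C[n*IC] = A[n*IA] + B[n*IB] for n in [0, N).
   The real function does the same arithmetic with vector instructions. */
static void vadd_model(const float *A, long IA,
                       const float *B, long IB,
                       float *C, long IC, size_t N) {
    for (size_t n = 0; n < N; n++) {
        C[n * IC] = A[n * IA] + B[n * IB];
    }
}
```

With stride 2 you could, say, mix just the left channels of two interleaved stereo buffers: vadd_model(buf1, 2, buf2, 2, dest, 2, frameCount) touches only the even indices.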
Hope that gets you started...
Gratefully,
Hari Karam Singh
══════════════════════════════════════
"Finally, an app that breaks you out of the 2 inch screen!"
Sound Wand ~ A real musical instrument that fits in your pocket!
http://soundwandapp.com | Tw | Fb | Yt | Sc
══════════════════════════════════════
Club 15CC
An Inspirational Media and Technology Studio
London, UK
+44 (0) 207 394 8587
+44 (0) 779 055 6418
http://club15cc.com/
Part of Amritvela
From: Pier [mailto:email@hidden]
Sent: 06 November 2012 10:41
To: email@hidden
Cc: email@hidden
Subject: Re: RemoteIO Question
Hi Hari,
Thanks for confirming my suspicions. I was just reading your blog this morning regarding mixing - not sure I understand it all though!
I'm not sure what this means : 2. Render Notify attached to RemoteIO which processes the prerender phase on bus 1 (i.e., if (bus != 1 || !(ioFlags & k...PreRender)) return noErr; ) and which stores the mic data in a ring buffer
Could you explain a bit more? :D
I was tempted to go the manual mixing route when I read about the Multichannel Mixer AudioUnit. This is the first time I've heard of the Accelerate framework - what advantages does it offer with regards to this?
On Tue, Nov 6, 2012 at 6:23 PM, Hari Karam Singh <email@hidden> wrote:
Hi Pier,
I’ve just been through something similar. I believe the answer is yes, you may only have one RemoteIO unit. Also, you definitely need a RemoteIO as the last unit to send the audio to the hardware. Reference: http://lists.apple.com/archives/coreaudio-api/2009/Apr/msg00042.html
I’ve not much experience with using the mic, but I would think you could accomplish what you need with the following setup:
1. Mixer output bus 0 -> RemoteIO input bus 0
2. Render Notify attached to RemoteIO which processes the prerender phase on bus 1 (i.e., if (bus != 1 || !(ioFlags & k...PreRender)) return noErr; ) and which stores the mic data in a ring buffer
3. Attach a custom render callback to Mixer input bus 0 which reads from the ring buffer
4. Attach your other source to mixer input bus 1
You could also skip the mixer unit entirely and do the mixing manually inside of the RemoteIO’s render notify callback using the lightning fast vDSP functions in the Accelerate framework.
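Manual mixing in that callback is just a multiply-accumulate per source; vDSP_vsma is the vectorized one-call version. A plain-C sketch of the idea (mix_into is my name, not a real API):

```c
#include <stddef.h>

/* Accumulate one source into the output buffer with a per-source gain:
   out[i] += gain * src[i].  Zero `out`, then call once per source each
   render cycle.  (vDSP does this in one call: vDSP_vsma.) */
static void mix_into(float *out, const float *src, float gain, size_t n) {
    for (size_t i = 0; i < n; i++) {
        out[i] += gain * src[i];
    }
}
```

Keeping the gains summing to 1.0 or less is a simple way to avoid clipping the mixed signal.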
Gratefully,
Hari Karam Singh
http://soundwandapp.com/
http://club15cc.com/
**RemoteIO1 (for recording to buffer) -> Mixer -> RemoteIO2 (for playback of output)**
RemoteIO1 is used for 2 purposes:
1) To feed audio into the mixer channel 0
2) To record audio from mic to a buffer
Mixer:
1) Takes audio from RemoteIO1 - input 0
2) Mixes the audio from (1) with audio from the buffer - input 1
RemoteIO2:
1) Takes the mixed audio and sends it to playback
Initially I thought that I could just playback from mixer output but the following gives me an error. Can I confirm that I need another RemoteIO to do playback?
// Enable Mixer for playback
UInt32 flag = 1;
status = AudioUnitSetProperty(_mixerUnit,
                              kAudioOutputUnitProperty_EnableIO,
                              kAudioUnitScope_Output,
                              0,
                              &flag,
                              sizeof(flag));
if (noErr != status) { NSLog(@"Enable Mixer for playback error"); return; }
Also, I did the following test and realised there seems to be only one RemoteIO available (addresses for inputComponent and inputComponent2 are the same)
AudioComponent inputComponent = AudioComponentFindNext(NULL, &desc);
AudioComponent inputComponent2 = AudioComponentFindNext(NULL, &desc);
Is it true that I can only have one instance of RemoteIO in my app? If so, what are the alternatives for the 2nd RemoteIO?