Re: AUGraph recording iPhone
- Subject: Re: AUGraph recording iPhone
- From: William Stewart <email@hidden>
- Date: Tue, 8 Dec 2009 14:54:52 -0800
On Dec 8, 2009, at 2:31 PM, Bruce Meagher wrote:
Thank you for the response Bill. That's how I have it working now
(set up the remoteIO callback directly, call AudioUnitRender in the
callback, which copies data to a buffer). However, I then pull the audio
samples out of the buffer from a different thread (since I want a
larger window and don't need any of the processed audio for the
output). If I don't need the data for the output, are there other
reasons I should process the data in the output cycle?
nope - totally up to you and what you want to do with the data
It just seemed that since I was using the AUGraph there would be
something similar to AUGraphSetNodeInputCallback (like an
AUGraphSetNodeOutputCallback) that I should be using for grabbing
the output of a particular node in the graph. Thanks for setting me
straight.
On my second question: I've seen several places on the web (like
<http://blog.faberacoustical.com/2009/iphone/iphone-microphone-frequency-response-comparison/>)
that show some measured performance of the iPhone mic hardware, but is
there any Apple documentation available to us registered developers
about the different iPhone/iPod audio hardware specs?
not that I know of
Bill
Thanks,
Bruce
On Dec 8, 2009, at 10:45 AM, William Stewart wrote:
On Dec 8, 2009, at 10:11 AM, Bruce Meagher wrote:
Hi All,
I currently have an iPhone app that plays multiple streams of
audio through a multichannel mixer connected to the remoteIO
audio unit, set up using an AUGraph. I now want to add the
ability to process the incoming audio from the built-in mic. I
was able to record the audio through the remoteIO audio unit if I
used:
    AudioUnitSetProperty(mRemoteIO, kAudioOutputUnitProperty_SetInputCallback,
                         kAudioUnitScope_Output, 1, &rcbs, sizeof(rcbs));
    AudioUnitSetProperty(mRemoteIO, kAudioOutputUnitProperty_EnableIO,
                         kAudioUnitScope_Input, 1, &one, sizeof(one));
However, it seems like I should be using the AUGraph calls since
the graph is kind of managing my audio units. Is the correct
approach to use AUGraphAddRenderNotify to grab the mic audio from
the remoteIO audio unit (didn't get this to work),
You have the callback that you establish on the remote I/O unit
that will tell you when it has audio available (and how much it
has). So, I would use that. Then I would have that callback call
AudioUnitRender on the AU to get the input data and have it render
that data into a buffer. Then when the output side fires (the input
side fires first), you can read the input data from this buffer to
process, etc., for that output cycle.
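The input-callback-plus-AudioUnitRender pattern Bill describes might be sketched like this (platform-specific, requires the AudioToolbox framework; the refCon arrangement and the "copy into shared buffer" step are illustrative assumptions, not from this thread):

```
#include <AudioToolbox/AudioToolbox.h>

// Sketch only: input render callback installed on the remote I/O unit.
// For input callbacks ioData is NULL, so we render into our own buffer list.
static OSStatus InputCallback(void *inRefCon,
                              AudioUnitRenderActionFlags *ioActionFlags,
                              const AudioTimeStamp *inTimeStamp,
                              UInt32 inBusNumber,
                              UInt32 inNumberFrames,
                              AudioBufferList *ioData)
{
    AudioUnit remoteIO = (AudioUnit)inRefCon; // assume refCon is the AU

    // bufList is assumed to be set up elsewhere to point at our own storage,
    // sized for inNumberFrames of the input stream format.
    AudioBufferList bufList; /* illustrative; real code prepares this */

    OSStatus err = AudioUnitRender(remoteIO, ioActionFlags, inTimeStamp,
                                   1 /* input bus */, inNumberFrames,
                                   &bufList);
    if (err == noErr) {
        // Copy bufList into the shared buffer the processing thread reads,
        // then return; the output side will pick the data up on its cycle.
    }
    return err;
}
```

The callback fires when the hardware has captured frames; AudioUnitRender on bus 1 then pulls those frames out of the remote I/O unit into your buffer.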
The aurioTouch sample is a reasonable example of how to do audio thru
using audio units, and it is not that much different to have the
output side be a graph that has a callback somewhere in it (from,
say, one of the inputs of your mixer) to get the audio data.
or is the correct way to configure the audio unit directly, or
something else?
Also, is there any Apple documentation (or API call) available
that describes the specs of the microphone on the different iPhone
platforms (e.g. supported sample rates, mono or stereo, any
filter characteristics, etc.)?
Thanks,
Bruce
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list (email@hidden)
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden