Re: ios processing graph
- Subject: Re: ios processing graph
- From: Patrick Shirkey <email@hidden>
- Date: Sat, 01 Oct 2011 06:01:14 +0200 (CEST)
- Importance: Normal
Hi,
Following up on this one with some code. I'm trying to add a delay node
to the AUGraph using the code from the 'iPhoneMixerEqGraphTest' example
project. If anyone has the time to enlighten me as to why the call to add
a new node is failing, your assistance would be much appreciated.
- (void)initializeAUGraph
{
    printf("initializeAUGraph\n");

    AUNode outputNode;
    AUNode eqNode;
    AUNode mixerNode;
    AUNode delayNode;

    OSStatus result = noErr;

    printf("new AUGraph\n");

    // create a new AUGraph
    result = NewAUGraph(&mGraph);
    if (result) { printf("NewAUGraph result %ld %08X %4.4s\n", (long)result,
                         (unsigned int)result, (char*)&result); return; }

    // create a CAComponentDescription for each of the AUs we want in the graph
    // output unit
    CAComponentDescription output_desc(kAudioUnitType_Output,
                                       kAudioUnitSubType_RemoteIO,
                                       kAudioUnitManufacturer_Apple);

    // delay unit -- the original paste reused the RemoteIO (output)
    // description here; the delay needs its own effect-type description,
    // and the subtype must be one that actually exists on the target OS,
    // otherwise the graph will fail to find and open the unit
    CAComponentDescription delay_desc(kAudioUnitType_Effect,
                                      kAudioUnitSubType_Delay,
                                      kAudioUnitManufacturer_Apple);

    // iPodEQ unit
    CAComponentDescription eq_desc(kAudioUnitType_Effect,
                                   kAudioUnitSubType_AUiPodEQ,
                                   kAudioUnitManufacturer_Apple);

    // multichannel mixer unit
    CAComponentDescription mixer_desc(kAudioUnitType_Mixer,
                                      kAudioUnitSubType_MultiChannelMixer,
                                      kAudioUnitManufacturer_Apple);

    printf("add nodes\n");

    // create a node in the graph for each AudioUnit, using the supplied
    // AudioComponentDescription to find and open that unit
    result = AUGraphAddNode(mGraph, &output_desc, &outputNode);
    if (result) { printf("AUGraphAddNode 1 result %ld %4.4s\n", (long)result,
                         (char*)&result); return; }

    result = AUGraphAddNode(mGraph, &delay_desc, &delayNode);
    if (result) { printf("AUGraphAddNode 2 result %ld %4.4s\n", (long)result,
                         (char*)&result); return; }

    ....
}
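For reference, once the nodes are added, the remaining wiring in the
iPhoneMixerEqGraphTest pattern is a chain of AUGraphConnectNodeInput calls
plus an input render callback on the head of the chain. Below is a minimal
sketch (not from the original post) of how an eqNode -> delayNode ->
mixerNode -> outputNode chain could be wired; the renderInput callback and
the wireGraph helper are hypothetical placeholder names, and the code
assumes the graph and node variables from the method above are in scope:

```c
#include <AudioToolbox/AudioToolbox.h>

// Placeholder source callback: this is where the app would fill ioData
// with inNumberFrames frames of audio for the head of the chain.
static OSStatus renderInput(void *inRefCon,
                            AudioUnitRenderActionFlags *ioActionFlags,
                            const AudioTimeStamp *inTimeStamp,
                            UInt32 inBusNumber,
                            UInt32 inNumberFrames,
                            AudioBufferList *ioData)
{
    // fill ioData here (silence, file data, synthesis, ...)
    return noErr;
}

// Hypothetical helper, called after the AUGraphAddNode calls above.
static void wireGraph(AUGraph graph, AUNode eqNode, AUNode delayNode,
                      AUNode mixerNode, AUNode outputNode)
{
    // open the graph so the AudioUnits behind the nodes are instantiated
    AUGraphOpen(graph);

    // feed source audio into the head of the chain (the EQ node, bus 0)
    AURenderCallbackStruct cb = { renderInput, NULL };
    AUGraphSetNodeInputCallback(graph, eqNode, 0, &cb);

    // eq -> delay -> mixer -> output
    AUGraphConnectNodeInput(graph, eqNode,    0, delayNode,  0);
    AUGraphConnectNodeInput(graph, delayNode, 0, mixerNode,  0);
    AUGraphConnectNodeInput(graph, mixerNode, 0, outputNode, 0);

    // initialize and start rendering
    AUGraphInitialize(graph);
    AUGraphStart(graph);
}
```

The render callback is only needed at the head of the chain; downstream
nodes pull from their upstream connection automatically when the graph runs.
(Not run here: this requires the AudioToolbox framework and audio hardware.)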
> Hi,
>
> Apologies in advance for the n00b question. I have been researching the
> docs for the processing graph on the iPhone. I am working from the
> aurioTouch example and the multichannel playback examples in the iOS dev
> portal.
>
> I have also found the blog posts by Michael Tyson, Tim Bolstad and
> downloaded code from github by Brennon Bortz and Marek Bereza.
>
> So, after doing all that and reading all the code surely by now I would
> have figured out how to make it all "Just Work(tm)".
>
> Alas I am still struggling to get my head around the fundamental process
> of connecting the signal from the eqNode to the delayNode to the
> mixerNode. I understand that there is a render callback involved and the
> RemoteIO unit. But I don't see how it is supposed to fit together.
>
>
> Cheers
>
--
Patrick Shirkey
Boost Hardware Ltd
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list (email@hidden)
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden