Hi,
I am having major(!) problems with render callbacks in a multichannel mixer application.
Basically, I intend to loop audio files of different lengths, so I looked at the iPhone MixerEQGraph example and extended it to take in 8 audio files on 8 separate busses.
The problem lies in setting up a working render callback for busses with different loop lengths.
Say I have 4 busses with 4-bar loops, 2 with 8-bar loops and another 2 with 16-bar loops. If I want them all to play back fully, I think I need separate render input functions for all 3 cases to accommodate the different file lengths.
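To make that concrete, here is roughly the kind of per-bus setup I mean, following the MixerEQGraph pattern (only a minimal sketch: BusLoop, gBusLoops, RenderLoopInput and AttachInputCallbacks are my own illustrative names, error handling is trimmed, and I'm assuming one callback shared across the busses with per-bus state passed through inputProcRefCon):

#include <AudioToolbox/AudioToolbox.h>

// Per-bus state: each bus carries its own loop length, so one callback
// can service loops of different lengths (names are illustrative).
typedef struct {
    Float32 *data;       // non-interleaved samples for this bus's loop
    UInt32   numFrames;  // total frames in this loop (4, 8 or 16 bars)
    UInt32   frameNum;   // current read position, wraps at numFrames
} BusLoop;

static BusLoop gBusLoops[8];   // one entry per mixer input bus

// Shared render callback, sketched further below.
static OSStatus RenderLoopInput(void *inRefCon,
                                AudioUnitRenderActionFlags *ioActionFlags,
                                const AudioTimeStamp *inTimeStamp,
                                UInt32 inBusNumber,
                                UInt32 inNumberFrames,
                                AudioBufferList *ioData);

// Attach the callback to all 8 mixer input busses; the callback tells the
// busses apart via inBusNumber and the gBusLoops entry it indexes.
static OSStatus AttachInputCallbacks(AUGraph graph, AUNode mixerNode)
{
    for (UInt32 bus = 0; bus < 8; ++bus) {
        AURenderCallbackStruct input;
        input.inputProc       = RenderLoopInput;
        input.inputProcRefCon = gBusLoops;   // shared array of per-bus state
        OSStatus result = AUGraphSetNodeInputCallback(graph, mixerNode, bus, &input);
        if (result != noErr) return result;
    }
    return noErr;
}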
Curiously, I can get separate render callbacks in place for each loop set if I create separate structs for each loop set and then attach these to the mixer, but the render input callbacks still do not work...
I have now tried implementing this in a number of different ways, but the best I can get is truncated loops of 4 bars on all busses!
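For example, the render input callback I'm aiming at would look something like this (again only a sketch, reusing the BusLoop struct from above and assuming non-interleaved Float32 data on each mixer input bus):

static OSStatus RenderLoopInput(void                       *inRefCon,
                                AudioUnitRenderActionFlags *ioActionFlags,
                                const AudioTimeStamp       *inTimeStamp,
                                UInt32                      inBusNumber,
                                UInt32                      inNumberFrames,
                                AudioBufferList            *ioData)
{
    BusLoop *loop = &((BusLoop *)inRefCon)[inBusNumber];

    Float32 *outL = (Float32 *)ioData->mBuffers[0].mData;
    Float32 *outR = (ioData->mNumberBuffers > 1)
                    ? (Float32 *)ioData->mBuffers[1].mData : NULL;

    UInt32 pos = loop->frameNum;
    for (UInt32 i = 0; i < inNumberFrames; ++i) {
        Float32 sample = loop->data[pos];
        outL[i] = sample;
        if (outR) outR[i] = sample;      // mono source copied to both channels
        if (++pos >= loop->numFrames)    // wrap at this bus's own loop length,
            pos = 0;                     // not at a single shared length
    }
    loop->frameNum = pos;
    return noErr;
}

The point of the sketch is that the wrap length comes from each bus's own struct rather than from a single shared value - which is what I can't seem to get working across the different loop lengths.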
I have looked at all the examples I can find and the Apple WWDC podcasts, and trawled the APIs - I know I'm missing something fairly fundamental :-)
Has anyone encountered this problem?
I would greatly appreciate any help or advice that anyone may have.
Thanks in advance,
Charlie
Dr Charlie Cullen
Head of Multimodal Interaction Group (MmIG)
Digital Media Centre
Dublin Institute of Technology
Aungier Street
Dublin 2
tel: +353 1 402 3273
mob: +353 86 841 0697