Offline rendering in an AUGraph using a Generic Output node on iOS
- Subject: Offline rendering in an AUGraph using a Generic Output node on iOS
- From: David Blake <email@hidden>
- Date: Sun, 03 Mar 2013 23:55:05 +1100
Hi,
I have been trying to enable offline rendering in my app. From what I understand, the way to do this is to use a Generic Output node instead of a Remote IO node, and then call AudioUnitRender on the generic output unit manually, as quickly as you like.
In my research, however, no one seems to have an example of how to actually call AudioUnitRender from outside an existing callback. I have also come across historical posts on this list where people were not sure whether this is even possible in a graph, or whether you have to interact with a single audio unit directly. Has anyone achieved this, and if so, could you post some example code? My code is below, but I am getting the dreaded error -50 when I call AudioUnitRender().
UInt32 inNumberFrames = 256;

AudioTimeStamp inTimeStamp;
memset(&inTimeStamp, 0, sizeof(inTimeStamp)); // zero the fields we don't set explicitly
inTimeStamp.mSampleTime = 0;
inTimeStamp.mFlags = kAudioTimeStampSampleTimeValid;

AudioUnitRenderActionFlags ioActionFlags = kAudioOfflineUnitRenderAction_Render;

UInt32 channelCount = 1;
AudioBufferList *bufferList = (AudioBufferList *)malloc(sizeof(AudioBufferList) + sizeof(AudioBuffer) * (channelCount - 1));
if (NULL == bufferList) {
    NSLog(@"*** malloc failure for allocating bufferList memory");
    return;
}
AudioSampleType *audioDataLeft = (AudioSampleType *)calloc(inNumberFrames, sizeof(AudioSampleType));

// initialize the mNumberBuffers member
bufferList->mNumberBuffers = channelCount;

// initialize the mBuffers member to 0
AudioBuffer emptyBuffer = {0};
for (size_t arrayIndex = 0; arrayIndex < channelCount; arrayIndex++) {
    bufferList->mBuffers[arrayIndex] = emptyBuffer;
}

// set up the AudioBuffer structs in the buffer list
bufferList->mBuffers[0].mNumberChannels = 1;
bufferList->mBuffers[0].mDataByteSize = inNumberFrames * sizeof(AudioSampleType);
bufferList->mBuffers[0].mData = audioDataLeft;

do {
    OSStatus error = noErr;
    if ((error = AudioUnitRender(outputUnit, &ioActionFlags, &inTimeStamp, 0, inNumberFrames, bufferList)) != noErr) {
        printf("Cannot AudioUnitRender: %d\n", (int)error);
    }
    inTimeStamp.mSampleTime += inNumberFrames;
} while (self.exportingEnabled && (inTimeStamp.mSampleTime < 5000 /* just so it doesn't get stuck in a loop while testing */));

free(audioDataLeft);
free(bufferList);
NSLog(@"render complete");
In the above code, outputUnit is of the kAudioUnitSubType_GenericOutput subtype and is connected to a mixer via:
result = AUGraphConnectNodeInput(processingGraph, mixerNodeMaster, 0, iONode, 0);
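For completeness, the nodes themselves are created roughly like this (a trimmed-down sketch with error checking omitted; I'm showing the standard multichannel mixer, and in my code processingGraph, iONode, mixerNodeMaster and outputUnit are ivars):

AudioComponentDescription outputDesc = {0};
outputDesc.componentType = kAudioUnitType_Output;
outputDesc.componentSubType = kAudioUnitSubType_GenericOutput;
outputDesc.componentManufacturer = kAudioUnitManufacturer_Apple;

AudioComponentDescription mixerDesc = {0};
mixerDesc.componentType = kAudioUnitType_Mixer;
mixerDesc.componentSubType = kAudioUnitSubType_MultiChannelMixer;
mixerDesc.componentManufacturer = kAudioUnitManufacturer_Apple;

NewAUGraph(&processingGraph);
AUGraphAddNode(processingGraph, &outputDesc, &iONode);
AUGraphAddNode(processingGraph, &mixerDesc, &mixerNodeMaster);
AUGraphOpen(processingGraph);
AUGraphNodeInfo(processingGraph, iONode, NULL, &outputUnit); // grab the generic output unit
AUGraphConnectNodeInput(processingGraph, mixerNodeMaster, 0, iONode, 0);
AUGraphInitialize(processingGraph);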
The render callback is attached to the mixerNodeMaster unit. I am calling ExtAudioFileWriteAsync in that callback while exportingEnabled is true; it is sketched just below.
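In case it helps, the callback has roughly this shape (a simplified sketch; ExportState and mixerInputCallback are illustrative names rather than my exact code, and the ExtAudioFileRef is assumed to have been opened and primed elsewhere):

#import <AudioToolbox/AudioToolbox.h>

// Hypothetical state handed to the callback via inRefCon.
typedef struct {
    ExtAudioFileRef exportFile; // opened with ExtAudioFileCreateWithURL
    Boolean exporting;          // mirrors self.exportingEnabled
} ExportState;

static OSStatus mixerInputCallback(void *inRefCon,
                                   AudioUnitRenderActionFlags *ioActionFlags,
                                   const AudioTimeStamp *inTimeStamp,
                                   UInt32 inBusNumber,
                                   UInt32 inNumberFrames,
                                   AudioBufferList *ioData)
{
    ExportState *state = (ExportState *)inRefCon;

    // ... fill ioData with inNumberFrames of source samples here ...

    if (state->exporting) {
        // Prime once beforehand with ExtAudioFileWriteAsync(file, 0, NULL)
        // so no allocation happens on this thread.
        ExtAudioFileWriteAsync(state->exportFile, inNumberFrames, ioData);
    }
    return noErr;
}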
I am using mono signed 16-bit integer CAF files, and have set the mixer unit's stream format to match (see the sketch after this paragraph). Would this freak the generic output out, or can it accept 16-bit (AudioSampleType) data without a problem?
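For reference, the format I set on the mixer looks roughly like this (a sketch; the 44.1 kHz sample rate is an example value, and mixerUnit is the unit fetched from mixerNodeMaster with AUGraphNodeInfo):

// Mono, packed, signed 16-bit integer samples (AudioSampleType is SInt16 here).
AudioStreamBasicDescription asbd = {0};
asbd.mSampleRate = 44100.0; // example rate
asbd.mFormatID = kAudioFormatLinearPCM;
asbd.mFormatFlags = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
asbd.mChannelsPerFrame = 1;
asbd.mBitsPerChannel = 16;
asbd.mBytesPerFrame = sizeof(AudioSampleType); // 2 bytes
asbd.mBytesPerPacket = sizeof(AudioSampleType);
asbd.mFramesPerPacket = 1;

AudioUnitSetProperty(mixerUnit, kAudioUnitProperty_StreamFormat,
                     kAudioUnitScope_Output, 0, &asbd, sizeof(asbd));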
If anyone can spot a glaring error in what I'm doing, I would be very appreciative if they could point me in the right direction! I only have one render callback in my app, so I could technically do this without the aid of an AUGraph; if someone has done offline rendering with just a single audio unit, I'd love to hear from you too.
Best regards,
David