Doug, thanks for the reply.
I don't think that's it; I do call
AudioQueueSetOfflineRenderFormat.
My sequence of Core Audio function calls goes as follows:
- AudioFileStreamOpen(self, propertyCallback, packetCallback,
hint, &parserID);
- err = AudioFileStreamParseBytes(parserID, inBufSize, inBuffer,
kAudioFileStreamParseFlag_Discontinuity );
(in propertyCallback)
- err = AudioFileStreamGetProperty( inAudioFileStream,
kAudioFileStreamProperty_DataFormat, &asbdSize, &asbd);
- err = AudioQueueNewOutput(&asbd, audioQueueCallback, inBuffer,
NULL, NULL, 0, &queue);
- AudioQueueAllocateBuffer( queue, nn, &queueBuffer );
- AudioQueueSetOfflineRenderFormat( queue, &asbd, &chLayout );
(in propertyCallback)
- err = AudioQueueOfflineRender( dcdrData->queue, &ts,
outBuffer, asbd.mFramesPerPacket );
(ts is a timestamp with ts.mSampleTime = 0 and ts.mHostTime
= the system clock time)
Here I get err = -66626 (kAudioQueueErr_InvalidOfflineMode), and outBuffer seems to contain only 0s.
On May 21, 2009, at 6:24 PM, Doug Wyatt wrote:
Maybe you didn't call AudioQueueSetOfflineRenderFormat first?