Re: QA1562 offline audio rendering advice
- Subject: Re: QA1562 offline audio rendering advice
- From: William Stewart <email@hidden>
- Date: Wed, 4 Mar 2009 17:34:01 -0800
The first thing I would try is running the original code on the iPhone - I would do that as a matter of course, and you haven't said whether you tried that and whether it works.
Bill
On Mar 4, 2009, at 5:17 PM, Bruce Meagher wrote:
I've been struggling to get offline audio rendering working on the iPhone (AAC -> LPCM) and was wondering if any of you might have some suggestions. I started with the C++ code attached to tech note QA1562 on the developer website, modified it to fit in an Objective-C class, and it all runs well in the simulator. The code renders my AAC files to LPCM files, and the rendered files play back fine and are the correct size.
However, when I try to run it on an actual iPhone I get an error when calling AudioQueueStart (error = -66681, kAudioQueueErr_CannotStart). If I just take out the calls to AudioQueueSetOfflineRenderFormat and AudioQueueOfflineRender, the AAC sound file plays through the hardware path out to the speaker, so I believe I'm setting up the queue correctly.
As soon as I include the call to AudioQueueSetOfflineRenderFormat I get the error in AudioQueueStart. Below is a clip of the call:
AudioStreamBasicDescription captureFormat;
captureFormat.mSampleRate = mDataFormat.mSampleRate;
captureFormat.mFormatID = kAudioFormatLinearPCM;
captureFormat.mFormatFlags = kLinearPCMFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
captureFormat.mFramesPerPacket = 1;
captureFormat.mChannelsPerFrame = mDataFormat.mChannelsPerFrame;
captureFormat.mBytesPerFrame = sizeof(SInt16) * captureFormat.mChannelsPerFrame;
captureFormat.mBytesPerPacket = captureFormat.mBytesPerFrame * captureFormat.mFramesPerPacket;
captureFormat.mBitsPerChannel = (captureFormat.mBytesPerFrame / captureFormat.mChannelsPerFrame) * 8;
captureFormat.mReserved = 0;

result = AudioQueueSetOfflineRenderFormat(mQueue, &captureFormat, acl);
The channel layout is copied from the file just as in the sample code. I've tried single-channel and two-channel AAC audio files, as well as plain LPCM files, to no avail. The AudioStreamBasicDescription logged for a two-channel file is:
Test[723:20b] captureFormat: mSampleRate = 44100.000000, mFormatID = 6c70636d, mFormatFlags = 12, mBytesPerPacket = 4, mFramesPerPacket = 1, mBytesPerFrame = 4, mChannelsPerFrame = 2, mBitsPerChannel = 16, mReserved = 0
I'm running iPhone OS 2.1.
The sample code calls "captureFormat.SetAUCanonical(myInfo.mDataFormat.mChannelsPerFrame, true); // interleaved" to set up the output format, but I couldn't find this function in the docs, or an equivalent call on AudioStreamBasicDescription.
Any suggestions on where I might be going astray, or on how to debug kAudioQueueErr_CannotStart?
Thanks,
Bruce
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list (email@hidden)
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden