Re: QA1562 offline audio rendering advice
- Subject: Re: QA1562 offline audio rendering advice
- From: Bruce Meagher <email@hidden>
- Date: Thu, 5 Mar 2009 17:13:02 -0800
Hi Doug,
Thanks for the tip on version 2.2. I was able to get AAC -> LPCM
offline rendering working on the device as described in the tech note
against iPhone OS 2.2, but not against 2.1. A couple of questions:
1) The aqrender.cpp file includes two header files, "CAXException.h"
and "CAStreamBasicDescription.h", that I couldn't find in any of the
audio frameworks. I can't seem to find the missing pieces to make the
code run "as is" on the target, but I'm probably just missing
something simple. I ported the sample code to an Objective-C class,
but as Bill mentions below it would be good to run the original code.
The sample code appears to be more of a command-line tool, with the
file names coming in from main's argv and argc. Is there a shell app
I can run to download and test command-line utilities on the phone?
2) Are there any published performance specs w.r.t. offline
rendering? On my MacBook Pro running the iPhone simulator, my app
decompresses a 128 kbps, 44.1 kHz, 2-channel AAC audio file to LPCM
about 100x faster than realtime (e.g. a 120-second AAC audio file
decompresses to 2-channel LPCM in 1.2 seconds, including writes to
disk). Running the same app on the iPhone I see a little over 2.5x
faster than realtime (the same 120-second AAC audio file decompresses
in ~48 seconds). Removing the writes to disk on the phone only saves
a few percent. If I drop the sample rate of the file to 32 kHz it
runs in 38 seconds (~3x faster than realtime). I'm not comparing the
iPhone to a Mac, but should I expect ~2.5x AAC -> LPCM rendering
speed on the iPhone hardware? I'm sure there are many variables, but
any guidelines would be great to know I'm on the right track.
3) Any tips on how I might get this to work with version 2.1? Have
any of you developers out there had success offline rendering AAC ->
LPCM with iPhone OS 2.1 (I thought there were many)?
Thanks,
Bruce
On Mar 4, 2009, at 6:53 PM, Doug Wyatt wrote:
The tech note was developed based on OS 2.2. I don't believe it was
tried at all on 2.1, and I have a suspicion that you're encountering
a bug that was fixed in 2.2.
Doug
On Mar 4, 2009, at 5:34 PM, William Stewart wrote:
The first thing I would try is running the original code on the
iPhone - I would do that as a matter of course, and you haven't said
whether you tried that and whether it works.
Bill
On Mar 4, 2009, at 5:17 PM, Bruce Meagher wrote:
I've been struggling to get offline audio rendering (AAC -> LPCM)
working on the iPhone and was wondering if any of you might have
some suggestions. I started with the C++ code attached to the tech
note QA1562 on the developer website, modified it to fit in an
Objective-C class, and it all runs well in the simulator. The code
renders my AAC files to LPCM files. The rendered files play back
fine and are the correct size.
However, when I try to run it on an actual iPhone I get an error
when calling AudioQueueStart (error = -66681,
kAudioQueueErr_CannotStart). If I just take out the calls to
AudioQueueSetOfflineRenderFormat and AudioQueueOfflineRender, the
AAC sound file plays through the hardware path out to the speaker,
so I believe I'm setting up the queue correctly.
As soon as I include the call to AudioQueueSetOfflineRenderFormat I
get the error in AudioQueueStart. Below is a clip of the call:
AudioStreamBasicDescription captureFormat;
captureFormat.mSampleRate       = mDataFormat.mSampleRate;
captureFormat.mFormatID         = kAudioFormatLinearPCM;
captureFormat.mFormatFlags      = kLinearPCMFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
captureFormat.mFramesPerPacket  = 1;
captureFormat.mChannelsPerFrame = mDataFormat.mChannelsPerFrame;
captureFormat.mBytesPerFrame    = sizeof(SInt16) * captureFormat.mChannelsPerFrame;
captureFormat.mBytesPerPacket   = captureFormat.mBytesPerFrame * captureFormat.mFramesPerPacket;
captureFormat.mBitsPerChannel   = (captureFormat.mBytesPerFrame / captureFormat.mChannelsPerFrame) * 8;
captureFormat.mReserved         = 0;
result = AudioQueueSetOfflineRenderFormat(mQueue, &captureFormat, acl);
The channel layout is copied from the file just as in the sample
code. I've tried single channel as well as two channel aac audio
files as well as just plain lpcm files to no avail. The
AudioStreamBasicDescription logged for a two channel file is:
Test[723:20b] captureFormat mSampleRate = 44100.000000 mFormatID =
6c70636d, mFormatFlags = 12, mBytesPerPacket = 4 mFramesPerPacket =
1 mBytesPerFrame = 4 mChannelsPerFrame = 2 mBitsPerChannel = 16
mReserved = 0
I'm running iPhone OS 2.1.
The sample code calls
"captureFormat.SetAUCanonical(myInfo.mDataFormat.mChannelsPerFrame,
true); // interleaved" to set up the output format, but I couldn't
find this function in the docs or an equivalent call in
AudioStreamBasicDescription.
Any suggestions where I might be going astray or how to debug
kAudioQueueErr_CannotStart?
Thanks,
Bruce
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list (email@hidden)
Help/Unsubscribe/Update your Subscription:
@apple.com
This email sent to email@hidden