

QA1562 offline audio rendering advice


  • Subject: QA1562 offline audio rendering advice
  • From: Bruce Meagher <email@hidden>
  • Date: Wed, 4 Mar 2009 17:17:47 -0800

I've been struggling to get offline audio rendering working on the iPhone (AAC -> LPCM) and was wondering if any of you might have some suggestions. I started with the C++ code attached to Technical Q&A QA1562 on the developer website, modified it to fit in an Objective-C class, and it all runs well in the simulator. The code renders my AAC files to LPCM files; the rendered files play back fine and are the correct size.

However, when I try to run it on an actual iPhone, I get an error when calling AudioQueueStart (error = -66681, kAudioQueueErr_CannotStart). If I just take out the calls to AudioQueueSetOfflineRenderFormat and AudioQueueOfflineRender, the AAC sound file plays through the hardware path out to the speaker, so I believe I'm setting up the queue correctly.

As soon as I include the call to AudioQueueSetOfflineRenderFormat, I get the error in AudioQueueStart. Below is a clip of the call:

AudioStreamBasicDescription captureFormat;
captureFormat.mSampleRate       = mDataFormat.mSampleRate;   // match the source file's rate
captureFormat.mFormatID         = kAudioFormatLinearPCM;
captureFormat.mFormatFlags      = kLinearPCMFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
captureFormat.mFramesPerPacket  = 1;                         // uncompressed audio: one frame per packet
captureFormat.mChannelsPerFrame = mDataFormat.mChannelsPerFrame;
captureFormat.mBytesPerFrame    = sizeof(SInt16) * captureFormat.mChannelsPerFrame;  // 16-bit interleaved
captureFormat.mBytesPerPacket   = captureFormat.mBytesPerFrame * captureFormat.mFramesPerPacket;
captureFormat.mBitsPerChannel   = (captureFormat.mBytesPerFrame / captureFormat.mChannelsPerFrame) * 8;
captureFormat.mReserved         = 0;

result = AudioQueueSetOfflineRenderFormat(mQueue, &captureFormat, acl);


The channel layout is copied from the file just as in the sample code. I've tried single-channel and two-channel AAC audio files, as well as plain LPCM files, to no avail. The AudioStreamBasicDescription logged for a two-channel file is:

Test[723:20b] captureFormat mSampleRate = 44100.000000 mFormatID = 6c70636d, mFormatFlags = 12, mBytesPerPacket = 4 mFramesPerPacket = 1 mBytesPerFrame = 4 mChannelsPerFrame = 2 mBitsPerChannel = 16 mReserved = 0

I'm running iPhone OS 2.1.

The sample code calls "captureFormat.SetAUCanonical(myInfo.mDataFormat.mChannelsPerFrame, true); // interleaved" to set up the output format, but I couldn't find this function in the docs or an equivalent call for AudioStreamBasicDescription.

Any suggestions where I might be going astray or how to debug kAudioQueueErr_CannotStart?

Thanks,

Bruce
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list      (email@hidden)
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden


  • Follow-Ups:
    • Re: QA1562 offline audio rendering advice
      • From: Doug Wyatt <email@hidden>
    • Re: QA1562 offline audio rendering advice
      • From: William Stewart <email@hidden>