
RE: iPhone audio decompression


  • Subject: RE: iPhone audio decompression
  • From: Luke Hollingworth <email@hidden>
  • Date: Fri, 24 Oct 2008 09:20:21 +0000
  • Importance: Normal

Hi Bill,

Thank you very much for the reply. I didn't fully understand what you meant by,
 
"but you can't decode more than one compressed stream at a time, so mp3 or aac is not going to help."

What I understood from this is that I can only decode one file at a time. If this is correct then the approach is fine, because it is the longer background songs/tracks that take up all the space, and only one of those would ever be played at once. I am therefore trying to implement offline rendering with AudioQueueOfflineRender, but the documentation on it is limited.

I've looked through the previous posts on this subject and none of them seem to have arrived at a working solution. My best guess at the approach is to set up an output audio queue in the manner shown in the playback example in the Audio Queue Services Reference, but instead of starting the queue, call AudioQueueSetOfflineRenderFormat to declare the decoded format the audio should be rendered in, and then call AudioQueueOfflineRender with an AudioQueueBuffer that is separate from the buffers used to supply compressed audio to the queue.
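
Condensed to just the order of calls, my best guess is something like the sketch below (the full code I am actually using is at the end of this message; outputFormat, layout and timeStamp are set up exactly as shown there, and the queue itself comes from AudioQueueNewOutput with HandleOutputBuffer as its callback, exactly as in the playback example):

    //Sketch of the intended call order only - see the full code below
    status = AudioQueueSetOfflineRenderFormat(mAQData.mQueue, &outputFormat, &layout);  //instead of AudioQueueStart

    AudioQueueBufferRef pcmBuffer;                                                      //separate buffer for the decoded PCM
    status = AudioQueueAllocateBuffer(mAQData.mQueue, outputFormat.mBytesPerFrame * 1152, &pcmBuffer);

    status = AudioQueueOfflineRender(mAQData.mQueue, &timeStamp, pcmBuffer, 1152);      //repeated until the end of the file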

However, although none of these calls return an error, when AudioQueueOfflineRender is called repeatedly to render audio my HandleOutputBuffer callback is never invoked to request more compressed data for the queue.

Any further direction on how this process should be performed, or confirmation that my implementation is correct and the problem lies with the API, would be greatly appreciated. The code added to the playback audio queue example in place of the AudioQueueStart call is as follows:

//Set up the rendered output format of the AudioQueue - interleaved, as only interleaved PCM is allowed
    AudioStreamBasicDescription outputFormat;
    outputFormat.mSampleRate = 44100;
    outputFormat.mFormatID = kAudioFormatLinearPCM;
    outputFormat.mFormatFlags = kLinearPCMFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
    outputFormat.mFramesPerPacket = 1;
    outputFormat.mChannelsPerFrame = mAQData.mDataFormat.mChannelsPerFrame;
    outputFormat.mBytesPerFrame = sizeof(SInt16) * outputFormat.mChannelsPerFrame;
    outputFormat.mBytesPerPacket = outputFormat.mBytesPerFrame * outputFormat.mFramesPerPacket;
    outputFormat.mBitsPerChannel = (outputFormat.mBytesPerFrame / outputFormat.mChannelsPerFrame) * 8;

//Set up the channel layout of the rendered output format of the AudioQueue
    AudioChannelLayout layout;
    layout.mChannelLayoutTag = kAudioChannelLayoutTag_Stereo;
    layout.mChannelBitmap = 0;
    layout.mNumberChannelDescriptions = 0;
   
//Set the queue to offline rendering mode
    status = AudioQueueSetOfflineRenderFormat ( mAQData.mQueue, &outputFormat, &layout );

//Create a time stamp for the start of the queue, using the sample time part of the structure
    AudioTimeStamp timeStamp;
    timeStamp.mFlags = kAudioTimeStampSampleTimeValid;
    timeStamp.mSampleTime = 0;

//Allocate an audio queue buffer to receive the rendered audio
    AudioQueueBufferRef outBuff;
    status = AudioQueueAllocateBuffer(mAQData.mQueue, outputFormat.mBytesPerFrame * 1152, &outBuff);

//Set up a wav file to write the audio to - this is a self-made class that is fully tested and functional
    WavWrite* writer = [[WavWrite alloc] init];
    [writer initWithFilePath:"/output.wav" numChannels:outputFormat.mChannelsPerFrame sampleType:kPCMStyle_SInt16];

//Render the audio file until it reaches the end of the file or times out
    UInt32 count = 0;
    while( !mAQData.mEOF && count++ < 1000000)
    {
        status = AudioQueueOfflineRender(mAQData.mQueue, &timeStamp, outBuff, 1152);
        timeStamp.mSampleTime += 1152;
        [writer writeAudio:outBuff->mAudioData numFrames:1152];
    }
   
    [writer closeFile];
    [writer release];
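
One thing I have wondered about, though I have not been able to confirm it from the documentation, is whether the queue still has to be primed with compressed buffers and started before offline rendering will drive the callback, even though no audio is being played to the hardware. If that is the case, I imagine the missing piece would look roughly like the sketch below, placed before the AudioQueueOfflineRender loop (kNumberBuffers, bufferByteSize and mBuffers are the names from the playback example) - but this is only a guess and I would welcome correction:

    //Speculative only - prime the queue by invoking the playback example's
    //callback by hand for each of its compressed-data buffers
    for (int i = 0; i < kNumberBuffers; ++i)
    {
        AudioQueueAllocateBuffer(mAQData.mQueue, mAQData.bufferByteSize, &mAQData.mBuffers[i]);
        HandleOutputBuffer(&mAQData, mAQData.mQueue, mAQData.mBuffers[i]);
    }

    //Speculative only - start the queue with a NULL start time; perhaps
    //offline rendering will not pull data through a queue that has never
    //been started
    status = AudioQueueStart(mAQData.mQueue, NULL);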






References:
  • Re: iPhone audio decompression (From: Luke Hollingworth <email@hidden>)
  • Re: iPhone audio decompression (From: William Stewart <email@hidden>)
