
How to Use AudioQueueOfflineRender?


  • Subject: How to Use AudioQueueOfflineRender?
  • From: "Louis Valentine" <email@hidden>
  • Date: Mon, 21 Jul 2008 10:44:55 -0700
  • Thread-topic: How to Use AudioQueueOfflineRender?

Does anyone have experience using the Audio Queue function AudioQueueOfflineRender()? The documentation is sorely lacking on exactly how this function is meant to be used.

What I am trying to do is convert a sound file that is currently in ima4 ADPCM format into a buffer of plain linear PCM data. I am able to load and play the sound using an Audio Queue, and that works fine. However, I don't want to play the sound; I want to render it into a buffer instead. So I have tried replacing the AudioQueueStart() call with a call to AudioQueueSetOfflineRenderFormat(), asking for mono, 16-bit LPCM output, with something like this:

 

-----------------
struct AudioChannelLayout layout =
{
    kAudioChannelLayoutTag_Mono | kAudioChannelLayoutTag_UseChannelBitmap,
    kAudioChannelBit_Center,
    0,
    NULL
};

AudioStreamBasicDescription streamDesc;
FillOutASBDForLPCM(streamDesc, 44100, 1, 16, 16, false, false);

AudioQueueSetOfflineRenderFormat(audioQueue, &streamDesc, &layout);
-----------------
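
For context, the queue itself is created up front in the usual way before any of this. The relevant pieces look roughly like the sketch below; fileFormat is the ima4 format read from the source file, and MyAQOutputCallback, myPlayerState, kBufferByteSize, and MyFillBufferFromFile are placeholder names for my own callback and bookkeeping code:

-----------------
#include <AudioToolbox/AudioToolbox.h>

// fileFormat is the ima4 ADPCM format obtained from the source file
// (kAudioFilePropertyDataFormat); myPlayerState is my own state struct.
AudioQueueRef audioQueue;
AudioQueueNewOutput(&fileFormat,          // source (ima4) format
                    MyAQOutputCallback,   // the regular audio queue callback
                    &myPlayerState,       // user data handed to the callback
                    NULL, NULL, 0,        // no run loop: callback on the queue's own thread
                    &audioQueue);

// Allocate a few buffers and prime them with packets from the file,
// exactly as when playing the sound normally.
AudioQueueBufferRef buffer;
AudioQueueAllocateBuffer(audioQueue, kBufferByteSize, &buffer);
MyFillBufferFromFile(&myPlayerState, buffer);   // reads packets and enqueues the buffer
-----------------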

 

Everything seems to work: all the functions return the expected result codes, and the audio queue callback is invoked, so the buffers do appear to be processed. But no matter when I call AudioQueueOfflineRender(), the output buffer I pass in always comes back full of empty data. I have tried calling AudioQueueOfflineRender() right after AudioQueueSetOfflineRenderFormat(), and also from inside the callback function, but in both cases the output data is empty. It is the correct size, though, so I end up with two seconds of silence in my output buffer.
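
Concretely, the render call itself looks something like this; the frame count and buffer size here are just values I have been experimenting with, not anything taken from the documentation:

-----------------
// Render starting at sample 0 of the queue's (offline) timeline.
AudioTimeStamp timestamp = { 0 };
timestamp.mSampleTime = 0;
timestamp.mFlags = kAudioTimeStampSampleTimeValid;

// Output buffer for the rendered LPCM (mono, 16-bit = 2 bytes per frame).
const UInt32 framesToRender = 4096;
AudioQueueBufferRef outputBuffer;
AudioQueueAllocateBuffer(audioQueue, framesToRender * 2, &outputBuffer);

// Ask the queue to decode into outputBuffer instead of playing.
OSStatus err = AudioQueueOfflineRender(audioQueue, &timestamp,
                                       outputBuffer, framesToRender);
// err comes back as noErr, but outputBuffer->mAudioData is all silence.
-----------------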

 

I did notice an earlier post to this mailing list that seems to describe the same issue:

 

http://lists.apple.com/archives/Coreaudio-api/2008/Feb/msg00126.html

 

He says that he called AudioQueueOfflineRender() during a "render callback" and it worked. How do you receive a render callback (assuming this is different from the regular audio queue callback)?
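
For reference, what I have been calling the regular audio queue callback is the AudioQueueOutputCallback passed to AudioQueueNewOutput(). Calling the render from inside it looks roughly like this (gOfflineBuffer and MyFillBufferFromFile are placeholders for my own buffer and refill code):

-----------------
// The ordinary output callback: the queue asks for more source (ima4) data here.
static void MyAQOutputCallback(void *inUserData,
                               AudioQueueRef inAQ,
                               AudioQueueBufferRef inBuffer)
{
    // Refill inBuffer from the source file and re-enqueue it, as when playing.
    MyFillBufferFromFile(inUserData, inBuffer);

    // This is also where I tried calling AudioQueueOfflineRender(),
    // but the rendered output is still silent.
    AudioTimeStamp timestamp = { 0 };
    timestamp.mSampleTime = 0;
    timestamp.mFlags = kAudioTimeStampSampleTimeValid;
    AudioQueueOfflineRender(inAQ, &timestamp, gOfflineBuffer, 4096);
}
-----------------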

 

If anyone has any tips or example code on how to properly make use of the AudioQueueOfflineRender() function, I'd greatly appreciate it. Thanks!

-Louis

 

