
Re: When has playback finished?


  • Subject: Re: When has playback finished?
  • From: William Stewart <email@hidden>
  • Date: Fri, 17 Mar 2006 11:26:04 -0800

Norman,

Your question was also answered when you asked it.

You are in control of the render process here. You are using an AudioConverter to convert data, as you describe. What do you do when you have no more input data at a particular point in time? You return zero packets of input and noErr from your AudioConverter input proc - this is the signal that you have no more input data to process. Your code is calling AudioConverterFillComplexBuffer, so when your input proc returns this, your FillComplexBuffer call gets a signal that this is the last of the output available for that input. You then have to turn around and provide this output (which might be a partial buffer) to the render callback. That's when you are finished playing - at the end of that I/O cycle. What you do then is up to you and how you want to handle this condition.

Bill


On 16/03/2006, at 8:35 PM, Norman Franke wrote:

I asked this some time ago, and didn't get a response. Forgive me if this is really simple...

I'm getting into Core Audio, and one thing is perplexing me. I'm playing a generated sound via the default Audio Unit. I set a render procedure via kAudioUnitProperty_SetRenderCallback, which then calls AudioConverterFillComplexBuffer to convert from the native format I'm using to whatever I'd like to play. This all works pretty well with one exception: how do I know when the sound has finished playing?

I can attempt to count samples after the call to AudioConverterFillComplexBuffer and translate that into a time, but how do you even determine when the sound started to play? The render procedure takes an AudioTimeStamp as an input, but the documentation says that is the current timestamp, which is really useless.

It's important not to terminate playback before all samples have finished, and also not to play for too long. How can this be done? I've spent several hours googling and searching through the mailing list archives, to no avail.

The sounds in question can range from a second to several minutes long.

-Norman



--
mailto:email@hidden
tel: +1 408 974 4056
__________________________________________________________________________
"Much human ingenuity has gone into finding the ultimate Before.
The current state of knowledge can be summarized thus:
In the beginning, there was nothing, which exploded" - Terry Pratchett
__________________________________________________________________________


_______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list      (email@hidden)
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden


References:
  • When has playback finished? (From: Norman Franke <email@hidden>)
