
Re: Another Audio Units/"playing now" question


  • Subject: Re: Another Audio Units/"playing now" question
  • From: William Stewart <email@hidden>
  • Date: Thu, 28 Oct 2004 12:38:12 -0700

You really have to have an external reference to do this... An AU on its own
has no real notion of any external time reference for when its audio is being
rendered - it's all just relative sample counts, in large part.

For instance, QT solves this problem with clock objects that dictate which
audio gets fed for a particular audio I/O cycle and synchronise video
presentation against that same audio clock. This model also lets you take
into account the presentation latencies of both the video and audio devices.

Bill

On 28/10/04 12:19 PM, "Simon Fraser" <email@hidden> wrote:

> Following up from my previous post about how to get at the timestamp of
> the audio that is currently playing (which I didn't really resolve),
> I have another related question.
>
> To recap, I'm playing audio with an Audio Unit which has to sync to
> video, so it's important that I can get some kind of timestamp for
> the audio that's currently playing. I'm feeding PCM data to the audio
> unit via a render callback, which in turn uses an AudioConverterRef to
> potentially do upsampling.
>
> That render callback gets a timestamp which appears to contain the
> same absolute time that I'd get by calling AudioGetCurrentHostTime()
> (i.e. something that's not too useful in this context).
>
> The user can pause audio/video playback, which I implement via
> AudioOutputUnitStop()/AudioOutputUnitStart(). However, I now have to
> take into account "paused" time when generating timestamps to sync with
> the video, which seems clumsy. So I'm still looking for some way to
> ask the Audio Unit (or some underlying API) how much time it has
> spent playing audio since the last 'start' call.
>
> Is there any other way to do this, other than to perhaps track the
> number of samples supplied in the callback, and convert to time based on
> the sample rate?
>
> Any input, or suggestions for different ways to approach this, are
> appreciated.
>
> Thanks
> Simon

--
mailto:email@hidden
tel: +1 408 974 4056

__________________________________________________________________________
Culture Ship Names:
Ravished By The Sheer Implausibility Of That Last Statement [GSV]
I said, I've Got A Big Stick [OU]
Inappropriate Response [OU]
Far Over The Borders Of Insanity And Still Accelerating [Eccentric]
__________________________________________________________________________



 _______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list      (email@hidden)
Help/Unsubscribe/Update your Subscription:

This email sent to email@hidden

References:
  • Another Audio Units/"playing now" question (From: Simon Fraser <email@hidden>)
