
AudioTimeStamps between devices


  • Subject: AudioTimeStamps between devices
  • From: Ethan Funk <email@hidden>
  • Date: Thu, 11 Nov 2004 12:13:56 -0700

I need to get audio input sample data from one device, process it, and pass it out to a different device. The devices will have different clock sources, so their sampling will not be exactly in sync. I have been playing around with ComplexPlayThru and now have a bunch of questions:

Is the following correct? (A rough sketch of how I'm reading these fields follows the list.)
1. <TimeStamp>.mSampleTime = the sample number of the first sample in the buffer, relative to the *audio device* clock (always in sequence, without skips).
2. <TimeStamp>.mHostTime = the host time stamp of the first sample in the buffer, relative to the CPU core clock (may exhibit buffer-to-buffer drift relative to mSampleTime).
3. <TimeStamp>.mRateScalar = the average (over some unknown time span?) deviation of the audio device's sample clock rate from the expected sample rate, relative to the CPU core clock. For example, if the expected rate is 96,000 sps and the device clock is 0.01% fast relative to the CPU clock, mRateScalar will settle out at a value of 1.0001, which corresponds to a sample rate of 96,009.6 sps.
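
For concreteness, here is roughly how I'm reading those three fields in my input IOProc. This is just a minimal sketch: the IOProc name and the hard-coded 96 kHz nominal rate are placeholders for illustration, and the real nominal rate should come from the device.

#include <CoreAudio/CoreAudio.h>
#include <stdio.h>

/* Illustrative nominal rate; query the device for the real value. */
static const Float64 kNominalSampleRate = 96000.0;

/* Hypothetical input IOProc that just logs the three AudioTimeStamp fields. */
static OSStatus MyInputIOProc(AudioDeviceID inDevice,
                              const AudioTimeStamp *inNow,
                              const AudioBufferList *inInputData,
                              const AudioTimeStamp *inInputTime,
                              AudioBufferList *outOutputData,
                              const AudioTimeStamp *inOutputTime,
                              void *inClientData)
{
    /* 1. Sample number of the first frame, counted on the device's own clock. */
    Float64 sampleTime = inInputTime->mSampleTime;

    /* 2. Host (CPU) clock time corresponding to that same first frame. */
    UInt64 hostTime = inInputTime->mHostTime;

    /* 3. Rate scalar: measured rate / nominal rate.
       e.g. 1.0001 * 96,000 = 96,009.6 frames per second actually delivered. */
    Float64 effectiveRate = kNominalSampleRate * inInputTime->mRateScalar;

    printf("sampleTime %f  hostTime %llu  ~%.1f frames/sec\n",
           sampleTime, (unsigned long long)hostTime, effectiveRate);
    return noErr;
}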


Assuming I have all of the above correct, I can then sync my two devices by getting samples from the source device, noting the mRateScalar in its time stamp, and passing the samples along to the output device through a VarRate filter. When doing a render for the VarRate, I would check the mRateScalar of the source and the mRateScalar of the destination, divide the two, and set the VarRate to this playback rate value (roughly as sketched below). I'm skipping over the ring buffer, baseline sample rate differences, and latency here to get at the heart of the synchronization problem.
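
Here is roughly what I have in mind for that rate update, assuming the "VarRate" is the Varispeed AudioUnit (kAudioUnitSubType_Varispeed) and that I keep the most recent time stamp seen from each device; the function and variable names are just placeholders.

#include <AudioUnit/AudioUnit.h>

/* Sketch: set the Varispeed playback rate from the two devices' rate scalars.
   inputStamp/outputStamp are the latest AudioTimeStamps from the source and
   destination devices; varispeed is an already-initialized AU instance. */
static OSStatus UpdateVarispeedRate(AudioUnit varispeed,
                                    const AudioTimeStamp *inputStamp,
                                    const AudioTimeStamp *outputStamp)
{
    /* Ratio > 1.0 means the source device runs fast relative to the output
       device, so the Varispeed must consume input slightly faster to keep
       the ring buffer level steady. */
    Float64 rate = inputStamp->mRateScalar / outputStamp->mRateScalar;

    return AudioUnitSetParameter(varispeed,
                                 kVarispeedParam_PlaybackRate,
                                 kAudioUnitScope_Global,
                                 0,               /* element */
                                 (Float32)rate,   /* playback rate */
                                 0);              /* buffer offset in frames */
}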

Am I using the VarRate filter properly here? Can it be updated during a render callback? Or should I be writing my own decimation/interpolation code?

Thanks all,
Ethan...



  • Follow-Ups:
    • Re: AudioTimeStamps between devices
      • From: Jeff Moore <email@hidden>