AudioQueueGetCurrentTime is behaving unexpectedly for me, and it behaves differently on the two computers I have here.
The documentation says of the outTimeStamp parameter: "The mSampleTime field is in terms of the audio queue's sample rate, and relative to the time at which the queue has started or will start."
Pretty clear. I'd expect it to increase monotonically, with zero being the instant at which the queue started; and to increase at exactly the same rate at which the queue is processing input frames. But it ain't so.
* On my Mac Mini G4, the current time runs faster than the queue's input frame rate. In other words: in my buffer callback I call AudioQueueGetCurrentTime, and I also examine the timestamp that AudioQueueEnqueueBufferWithParameters returns in its outActualStartTime parameter (see the sketch after this list). At first that start time is a few seconds ahead of the current time, as I'd expect, since the samples are being buffered and won't be played for a few seconds. But as time goes on the current time catches up, and after a minute it's, nonsensically, ahead of the buffer's start time, which would mean the buffers are being played before I enqueue them!
* On my MacBook Pro, the rate seems OK, but connecting or disconnecting the headphone jack while the queue is running causes the current-time's mSampleTime to reset to about -218424. So it's definitely not "relative to the time at which the queue has started".
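Concretely, the comparison in the first bullet looks roughly like the sketch below: a trimmed-down version of my buffer callback (the function name is just for illustration), assuming a plain PCM playback queue that's already running, so no packet descriptions, with the fill step and all error handling elided.

    #include <AudioToolbox/AudioToolbox.h>
    #include <stdio.h>

    // Playback-buffer callback: refill the buffer, re-enqueue it, then
    // compare the queue's current time against the actual start time
    // reported for the buffer just enqueued.
    static void MyFillBuffer(void *inUserData, AudioQueueRef inAQ,
                             AudioQueueBufferRef inBuffer)
    {
        // ... fill inBuffer->mAudioData, set inBuffer->mAudioDataByteSize ...

        AudioTimeStamp bufferStart;
        AudioQueueEnqueueBufferWithParameters(inAQ, inBuffer,
                                              0, NULL,  // no packet descriptions (PCM)
                                              0, 0,     // no trimming
                                              0, NULL,  // no parameter events
                                              NULL,     // start as soon as possible
                                              &bufferStart);

        AudioTimeStamp now;
        AudioQueueGetCurrentTime(inAQ, NULL, &now, NULL);  // no timeline object

        // Per the docs, 'now' should always trail 'bufferStart', since a
        // buffer can only start playing in the future. On the G4 the lead
        // shrinks over time and eventually goes negative.
        printf("current = %10.0f  buffer start = %10.0f  lead = %8.0f frames\n",
               now.mSampleTime, bufferStart.mSampleTime,
               bufferStart.mSampleTime - now.mSampleTime);
    }

On the G4 the printed lead starts out at a few seconds' worth of frames and steadily shrinks until it goes negative; on the MacBook Pro it stays sane until I touch the headphone jack.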
The documentation must be wrong, I think. How can I get what I want: the current time of the queue, in the same units as the mSampleTime that AudioQueueEnqueueBufferWithParameters reports?
—Jens