Re: iPhone RemoteIO timeline strange behavior
- Subject: Re: iPhone RemoteIO timeline strange behavior
- From: Zachary Kulis <email@hidden>
- Date: Thu, 19 Nov 2009 12:49:13 -0500
Jeff - you are absolutely correct. It should be divided by 16000 --
please excuse my typo.
Zach
Your math around the sample time looks wrong to me, but perhaps there isn't enough context here to judge.
If you are looking for the number of samples since an anchor point, you'd simply subtract your anchor time in samples from the current time in samples. I don't understand why you are also multiplying by the sample rate here. If you wanted to convert that into a duration in seconds, you'd actually want to divide by the sample rate, not multiply.
The calculation for converting mach_absolute_time to nanoseconds seems correct, though.
So, perhaps you ought to explain how you are deriving this calculation and what it is for, because it doesn't seem to me to be measuring elapsed time in samples.
On Nov 19, 2009, at 9:09 AM, Zachary Kulis wrote:
I am trying to synchronize input (recorded) and output audio samples
using the RemoteIO timeline. Unfortunately, I am getting inconsistent
timestamp information between the mSampleTime field and the mHostTime
fields in the AudioTimeStamp struct.
My first question is: does the mSampleTime provided in the input and
output callbacks refer to the same audio timeline? Below, I outline the
code I am using to produce the issue...
Since the RemoteIO input callback is always invoked before the output
callback, I simply set sampleTime0 equal to the first input timestamp
(mSampleTime). I do the same thing for hostTime0 using the first host
timestamp. For each subsequent callback, I compute elapsed times using
these initial timestamps. This code is shown below (assuming 16kHz
sampling):
sampleTimeElapsed = (sampleTimeCurrent - sampleTime0)*16000
hostTimeElapsed = AudioUtils::elapsedTime(hostTimeCurrent - hostTime0),
where elapsedTime() is defined below (I know that this function works
correctly).
Float64 AudioUtils::elapsedTime(UInt64 elapsed) {
    mach_timebase_info_data_t info;
    mach_timebase_info(&info);
    // Scale host ticks to nanoseconds using the timebase ratio,
    // then convert nanoseconds to seconds.
    elapsed *= info.numer;
    elapsed /= info.denom;
    return (elapsed * 1e-9);
}
The problem is that the results are inconsistent. On the iPhone (2.2.1),
the elapsed host and sample times for the input are roughly equal, but
drift apart as time proceeds (to be expected). However, on the
simulator, the elapsed input timestamps (sample vs host) do not agree at
all. Also, for both platforms, the elapsed output timestamps (sample vs
host) do not agree.
I am only trying to measure the delay between the input and output. I am
currently quite confused as to the cause of these inconsistencies. Any
insight would be greatly appreciated.
Coreaudio-api mailing list (email@hidden)