DefaultOutputDevice timestamps
- Subject: DefaultOutputDevice timestamps
- From: email@hidden
- Date: Sun, 7 Oct 2001 05:15:38 -0500 (CDT)
Hey,
I'm trying to figure out how to write several signals to output in a
well-synchronized way, but I've hit a snag. In the AudioTimeStamps
(inputTime and outputTime) I receive as parameters to the IOProc I've
given the default output device, the mHostTime field doesn't seem to
really work. Judging from the NSLogs made while writing to the default
output device, mHostTime rarely ever changes. I just ran it: the initial
input sample time was 563.000000, and the input sample time at which
mHostTime (for both input and output) finally changed was 6605737.000000
- that's about 2.5 minutes at a 44.1 kHz sample rate. mHostTime went from
1224094414600 to 1228389381896 (a jump of exactly 2^32, I notice), which
raises the question (and I know I should know this) - what unit is
mHostTime measured in, and what is its zero reference point based on? In
other words, is it based on the Unix epoch (1970), is it prone to
rollover, etc.?
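
For context, here's roughly the shape of my IOProc, trimmed down to just
the logging (MyIOProc is a placeholder name; the real code fills
outOutputData with samples):

    #include <CoreAudio/CoreAudio.h>
    #include <stdio.h>

    /* Registered with the default output device via something like:
       AudioDeviceAddIOProc(device, MyIOProc, NULL);
       AudioDeviceStart(device, MyIOProc); */
    static OSStatus MyIOProc(AudioDeviceID inDevice,
                             const AudioTimeStamp *inNow,
                             const AudioBufferList *inInputData,
                             const AudioTimeStamp *inInputTime,
                             AudioBufferList *outOutputData,
                             const AudioTimeStamp *inOutputTime,
                             void *inClientData)
    {
        /* mSampleTime advances every callback as expected, but mHostTime
           appears stuck at the same value for minutes at a stretch. */
        printf("in:  sample %f  host %llu\n",
               inInputTime->mSampleTime,
               (unsigned long long)inInputTime->mHostTime);
        printf("out: sample %f  host %llu\n",
               inOutputTime->mSampleTime,
               (unsigned long long)inOutputTime->mHostTime);
        /* (I haven't yet checked whether mFlags has
           kAudioTimeStampHostTimeValid set - maybe that's relevant?) */
        return noErr;
    }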
What am I doing wrong that keeps mHostTime from updating? I use the
values the default output device hands me in the callback/IOProc I
registered with it. Since we're supposed to use inputTime and outputTime
as the clock references for our audio, I'd like to figure out how to get
mHostTime to update properly (and what units it's in), as I want to try
using it to resynchronize waveform generators and wavetable readers.
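
In case it helps anyone answer: my working guess is that mHostTime is in
units of the Mach host clock (the same ticks mach_absolute_time()
returns), which would make the conversion to nanoseconds look something
like the sketch below - but that's exactly the part I'd like confirmed:

    #include <mach/mach_time.h>
    #include <stdint.h>

    /* Guess: convert an mHostTime value to nanoseconds via the Mach
       timebase. (CoreAudio's <CoreAudio/HostTime.h> also declares
       AudioConvertHostTimeToNanos(), which I assume does the same.) */
    static uint64_t HostTimeToNanos(uint64_t hostTime)
    {
        static mach_timebase_info_data_t tb;
        if (tb.denom == 0)
            mach_timebase_info(&tb);
        /* hostTime * tb.numer can overflow for very large tick counts;
           good enough for a quick sanity check like this */
        return hostTime * tb.numer / tb.denom;
    }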
Thanks,
Ben
P.S. Sorry for any poor form, but it's 3am.