AudioOutputUnit and timestamps
- Subject: AudioOutputUnit and timestamps
- From: Kurt Revis <email@hidden>
- Date: Sat, 27 Apr 2002 01:29:02 -0700
I'm using the DefaultAudioOutput AU to play a sound from a file. I
provide the AU with an input callback function, using
AudioUnitSetProperty(kAudioUnitProperty_SetInputCallback).
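For reference, here's roughly how the callback is hooked up (typed
from memory and simplified, so the AudioUnitInputCallback struct and
the callback typedef may not be letter-perfect; outputUnit and
mySoundFileState are just my names):

#include <AudioUnit/AudioUnit.h>

// The render callback; the full body is sketched later in this message.
static OSStatus MyInputProc(void *inRefCon,
                            AudioUnitRenderActionFlags inActionFlags,
                            const AudioTimeStamp *inTimeStamp,
                            UInt32 inBusNumber,
                            AudioBuffer *ioData);

// Hook the callback up to the already-opened, already-initialized
// DefaultAudioOutput unit.
static OSStatus SetupInputCallback(AudioUnit outputUnit, void *mySoundFileState)
{
    AudioUnitInputCallback callback;
    callback.inputProc = MyInputProc;
    callback.inputProcRefCon = mySoundFileState;  // state the callback needs

    return AudioUnitSetProperty(outputUnit,
                                kAudioUnitProperty_SetInputCallback,
                                kAudioUnitScope_Input,
                                0,  // input bus 0
                                &callback,
                                sizeof(callback));
}

After that, AudioOutputUnitStart() kicks things off and the callback
gets called regularly.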
It's basically working fine--the sound plays correctly--but now I want
to use the timestamps that my input callback is given. I don't quite
understand what they're telling me.
* My sound format is 16 bits per sample, stereo, 44.1 kHz, which means
4 bytes per frame. The input callback is given a 2048-byte buffer,
which is thus 512 frames (or 1024 samples). However, in the timestamps
I am given, the only valid field is mSampleTime, and it increases by
only 256 each time my callback is called. I would expect 512 (or maybe
even 1024, although that's stretching it)--why is it different?
(There's a rough sketch of what the callback is doing after this list.)
* It doesn't seem that I can translate this sample-based timestamp to a
host time. If I run the timestamp through AudioDeviceTranslateTime(),
using the correct device, I get a host time that is nowhere near the
value AudioGetCurrentHostTime() returns.
(I suppose I understand why--the mSampleTime starts at 0 when the AU is
started, but the device's sample timebase doesn't. But is there some
other way to do what I want?)
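Here's roughly what the callback itself is doing, with the debugging
output that produces the numbers above (again simplified and partly
from memory; gOutputDeviceID is just my placeholder for the device the
output unit is using, and it's entirely possible I'm calling
AudioDeviceTranslateTime() the wrong way):

#include <AudioUnit/AudioUnit.h>
#include <CoreAudio/CoreAudio.h>
#include <CoreAudio/HostTime.h>
#include <stdio.h>
#include <string.h>

extern AudioDeviceID gOutputDeviceID;  // looked up elsewhere

static OSStatus MyInputProc(void *inRefCon,
                            AudioUnitRenderActionFlags inActionFlags,
                            const AudioTimeStamp *inTimeStamp,
                            UInt32 inBusNumber,
                            AudioBuffer *ioData)
{
    static Float64 previousSampleTime = 0.0;

    // 16-bit stereo = 4 bytes per frame, so 2048 bytes = 512 frames.
    UInt32 frames = ioData->mDataByteSize / 4;

    // Question 1: this delta comes out as 256, not the 512 I expected.
    // (The printfs are only there for debugging.)
    printf("frames=%lu sampleTime=%f delta=%f\n",
           (unsigned long)frames,
           inTimeStamp->mSampleTime,
           inTimeStamp->mSampleTime - previousSampleTime);
    previousSampleTime = inTimeStamp->mSampleTime;

    // Question 2: try to turn the sample time into a host time.
    AudioTimeStamp translated;
    memset(&translated, 0, sizeof(translated));
    translated.mFlags = kAudioTimeStampHostTimeValid;  // ask for a host time back

    if (AudioDeviceTranslateTime(gOutputDeviceID, inTimeStamp, &translated) == noErr) {
        // This value is nowhere near what AudioGetCurrentHostTime() returns.
        printf("translated host time=%llu now=%llu\n",
               (unsigned long long)translated.mHostTime,
               (unsigned long long)AudioGetCurrentHostTime());
    }

    // (The real code fills ioData->mData with audio read from the file here.)
    return noErr;
}

The delta in the first printf is 256 every time, and the host time from
the second printf is nowhere near "now".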
The AudioOutputUnit interface is nice and easy to use, but it seems
awfully limiting to not be able to find out when a given buffer of data
will actually get played. (Not to mention, contrary to the whole
CoreAudio philosophy...)
Thanks in advance for any information or advice.
--
Kurt Revis
email@hidden
_______________________________________________
coreaudio-api mailing list | email@hidden
Help/Unsubscribe/Archives:
http://www.lists.apple.com/mailman/listinfo/coreaudio-api
Do not post admin requests to the list. They will be ignored.