Re: Minimizing input/output latency ?
- Subject: Re: Minimizing input/output latency ?
- From: William Stewart <email@hidden>
- Date: Tue, 19 Apr 2005 11:10:04 -0700
In addition to David's suggestion, there is also a new property
available in Tiger called I/O Usage -
kAudioDevicePropertyIOCycleUsage (this is also available on Panther
with QT 7)
What this enables you to do is the following:
When an I/O Proc fires today, it asks you for output data timestamped at:
Now + SafetyOffset + OneBuffer
so you are producing audio for an output time (OT) that is one buffer
(BT) plus the safety offset (ST) in the future.
With the I/O Usage property, you can adjust this output time by
specifying a fraction of 1 by which you want to cut down your I/O
time. The formula above then becomes:
OT = BT * I/OUsage + ST
where the default value of I/OUsage is 1. This does not change the
number of frames of audio in an I/O Proc, just the time at which the
device will be given that data.
For example, if I/O Usage is set to 0.5 (and your buffer size is 512
sample frames), that means the audio data you are producing in a
given I/O proc is for:
OT = 512 * 0.5 + ST
in other words, you are producing 512 frames of audio to be given to
the device at an OT of the safety offset plus 256 (512 * 0.5) samples
in the future.
Thus, your output latency has now been reduced by 256 samples in this
example.
The trade-off that you make, of course, is that you have also halved
the time that you have to produce those samples - so the maximum CPU
usage for that I/O proc would now be 50%.
The reason the HAL's default is what it is today is that it gives a
client essentially 100% of the CPU time to produce the output data:
we call you a full I/O buffer's duration before the data *has* to be
given to the driver for it to output. This property therefore lets
you pull in the actual output time to reduce latency.
There is no lower limit on the I/O usage, though of course it can't
be so low that you can't get the output data to the device in time;
that generates an overload.
We've implemented this property in AU Lab (which is the app we're
shipping in Tiger - /Developer/Applications/Audio) - so you can set
up a latency test and see how this can work for you.
Just as a side note - the input data is unchanged by this setting -
it only affects the time for the output data. Input data is always
the most recent data we can give you.
Bill
On 19/04/2005, at 6:23 AM, David Duncan wrote:
On Apr 19, 2005, at 06:15 AM, Stéphane Letz wrote:
Is there any way to keep this one-buffer-delay mode of operation
but still try to reduce the I/O latency? For example, by "changing"
the value of the kAudioDevicePropertySafetyOffset/
kAudioDevicePropertyLatency properties?
Both of the properties you mention are read-only -- they are
typically functions of the device hardware. However, you can reduce
the latency by reducing the size of the buffer requested via the
kAudioDevicePropertyBufferFrameSize property. This will of course
also increase the number of callbacks you receive.
--
Reality is what, when you stop believing in it, doesn't go away.
Failure is not an option. It is a privilege reserved for those who
try.
David Duncan
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list (email@hidden)
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden
--
mailto:email@hidden
tel: +1 408 974 4056
__________________________________________________________________________
"Much human ingenuity has gone into finding the ultimate Before.
The current state of knowledge can be summarized thus:
In the beginning, there was nothing, which exploded" - Terry Pratchett
__________________________________________________________________________