Re: Work-arounds for driver-level input latency
- Subject: Re: Work-arounds for driver-level input latency
- From: Zachary Schneirov <email@hidden>
- Date: Fri, 4 Sep 2009 14:55:39 -0500
I should also mention that the officially reported (fixed) input and
output latencies and safety offsets are each under ~50 frames (the
same as for devices without this problem), and I can test with IO
buffer sizes as low as 32 frames.
Zach
---------------------------------------
Zachary Schneirov
Northwestern University
On Sep 4, 2009, at 12:03 PM, Jeff Moore wrote:
So I'm confused. The first part of this message sounds like you are
writing a driver for a piece of hardware. Yet, in the second part
of the message you talk about adapting your application. Which are
you doing? A driver? An app? Both? The answer to your questions
really does depend on what you are doing.
Also, I don't see where you are describing what you are measuring
and how you are measuring it. Please be specific! We can't really
help you without knowing the actual details of what you are doing.
There are a lot of ways to do this and get misleading results.
Finally, I will also say that an application really has no control
over hardware latency. The best an app can do is to lower its IO
buffer size, which has a direct effect on latency at the cost of
having the IO thread run more often, and to use
kAudioDevicePropertyIOCycleUsage, which trades time in the IOProc
for lower latency. But there is nothing an app can do to change
what the driver is doing.
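[The two knobs described above can be sketched roughly as follows. This is not from the thread; it is a minimal illustration using the pre-10.6 HAL property API (AudioDeviceSetProperty and friends, since deprecated) that was current when this was written, and the specific values (64 frames, 0.5 cycle usage) are hypothetical — legal values are device-dependent.]

```c
// Sketch: lower the IO buffer size and shrink the IO cycle usage on the
// default output device. Requires macOS; link against the CoreAudio framework.
#include <CoreAudio/CoreAudio.h>
#include <stdio.h>

int main(void) {
    AudioDeviceID device = kAudioDeviceUnknown;
    UInt32 size = sizeof(device);
    OSStatus err = AudioHardwareGetProperty(kAudioHardwarePropertyDefaultOutputDevice,
                                            &size, &device);
    if (err != noErr || device == kAudioDeviceUnknown) return 1;

    // Smaller IO buffer => the IOProc runs more often, lowering latency.
    UInt32 frames = 64;  // hypothetical target
    err = AudioDeviceSetProperty(device, NULL, 0, false,
                                 kAudioDevicePropertyBufferFrameSize,
                                 sizeof(frames), &frames);
    if (err != noErr) fprintf(stderr, "buffer size err: %d\n", (int)err);

    // Fraction (0 < usage <= 1) of the IO cycle budgeted to the IOProc;
    // smaller values trade IOProc headroom for lower latency.
    Float32 usage = 0.5f;
    err = AudioDeviceSetProperty(device, NULL, 0, false,
                                 kAudioDevicePropertyIOCycleUsage,
                                 sizeof(usage), &usage);
    if (err != noErr) fprintf(stderr, "cycle usage err: %d\n", (int)err);
    return 0;
}
```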
On Sep 3, 2009, at 9:34 PM, Zachary Schneirov wrote:
I'm currently facing the difficult task of achieving low-latency
throughput on a class of USB chipset from C-Media (CM108/109/119)
whose sample timings CoreAudio apparently cannot consistently track.
Problem: Over time (about 5 minutes), frames grabbed from the
input stream become increasingly delayed, often by up to 250 ms.
I'm guessing the HAL's IO engine is underestimating the actual
sample rate of the device, leaving behind some number of frames
during each IO cycle. Likewise, audio is sometimes garbled,
perhaps from overestimating the sample rate and under-running the
driver's buffer (?), though this is less common.
I can observe the effect using a simple HAL IOProc input callback
or with any application that does software playthrough (e.g.,
CAPlayThrough, HALLab's Input window, etc.). On 10.5 and 10.6 I
can reset this latency only by either unplugging the device or
setting the stream format on any section. On 10.4 stopping the HAL
engine for the app seems necessary.
This chipset is common in USB headsets (especially those for
education) and has some desirable qualities (e.g., hardware-
playthrough control), so I'm motivated to adapt my application
(which involves very-low-latency audio conferencing) to work with
it as well as possible.
If knowledgeable CoreAudio people could tell me which of these
work-arounds might set me on the right track or provide better
suggestions, I would be extremely obliged:
a) Avoid AudioDeviceAddIOProc() and instead call
AudioDeviceRead() from a real-time thread with jitter-buffer
semantics, dropping a few frames every now and then
b) Create a user-space HAL driver to manipulate whatever
underlying ring buffer is feeding the HAL IO engine
c) Create an AppleUSBAudio plug-in kext to do the same
d) Set the stream format every few minutes to trigger a reset
(extremely disruptive when playing or recording)
e) Send commands directly to the chipset with IOKitLib to trigger
a reset, aiming for fewer side-effects
--
Jeff Moore
Core Audio
Apple
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list (email@hidden)
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden