Re: Work-arounds for driver-level input latency



  • Subject: Re: Work-arounds for driver-level input latency
  • From: Jeff Moore <email@hidden>
  • Date: Fri, 4 Sep 2009 13:06:13 -0700


On Sep 4, 2009, at 12:49 PM, Zachary Schneirov wrote:

Jeff,

I'm writing an audio conferencing application, but I'm willing to augment it with a driver/plugin if that could mitigate the issue. I realize this is beyond what most applications are expected to care about, but these devices are used almost exclusively with this app, so my scope is expanded by necessity. This is commodity hardware and I don't have full control over it, but at a low enough level, controlling the input latency seems almost possible. So the answer to your first question is probably: both.

In this case, the basic answer is what I said previously: there isn't anything you can do to lower the latency beyond adjusting your IO buffer size and setting kAudioDevicePropertyIOCycleUsage. There are no tricks hiding here.



Regarding measurements: because I can't know when the hardware itself is actually reading any given sample from the mic, timing was done three different ways: a) wiring output to input and comparing timing between played-out pulses, b) sending audio to another computer on the network (with a known network latency), c) using software playthrough on the same computer (with the latter being mostly perceptual). Obviously this doesn't completely isolate input latency from output latency, but exact measurements aren't that useful anyway as the latency is always increasing over time. On some machines the total input + output latency can reach half a second.
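
Measurement method (a) boils down to finding the played pulse's offset in the captured input. A minimal sketch of that detection step, in plain C with no audio API dependency (a simple threshold detector standing in for real pulse matching; buffer names are hypothetical):

```c
#include <stddef.h>
#include <math.h>

/* Find the offset (in frames) of the first sample in `captured` whose
 * magnitude crosses `threshold`. Returns -1 if none does. If playback
 * of the pulse started at frame 0, the round-trip latency in seconds
 * is offset / sampleRate. */
long find_pulse_offset(const float *captured, size_t frames, float threshold)
{
    for (size_t i = 0; i < frames; i++)
        if (fabsf(captured[i]) >= threshold)
            return (long)i;
    return -1;
}
```

At 48 kHz, for example, an offset of 480 frames corresponds to 10 ms of combined input + output latency; repeating the measurement over time would show the growth described above.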

Input latency is close to zero when the device is first connected (or after any of its stream formats are set), and builds quickly from there. It seems that the device either is not providing accurate timings for its samples or has a sample rate that is too dynamic for CoreAudio to track; regardless, and despite not actually working for the manufacturer, I want to get the lowest latency possible for my application.

Running a loop-back test like this that shows an increasing amount of latency implies that the input device and the output device are drifting relative to each other. This implies that they are separate devices. Presuming this is true, it raises a bunch of questions about how you are handling inter-device synchronization:
- Are you using an aggregate device and if so how is it configured (including such info as what devices are in the aggregation, which device is the master, which devices have resampling enabled, etc.)?
- If you are not using an aggregate device, how are you dealing with sync?


If this is a bidirectional device and you are seeing drift between the input and the output, it probably means that the hardware internally isn't synchronized. If this device uses our built-in USB Audio class driver, it would probably be a good idea to file a bug about this.




---------------------------------------
Zachary Schneirov
Northwestern University

On Sep 4, 2009, at 12:03 PM, Jeff Moore wrote:

So I'm confused. The first part of this message sounds like you are writing a driver for a piece of hardware. Yet, in the second part of the message you talk about adapting your application. Which are you doing? A driver? An app? Both? The answer to your questions really does depend on what you are doing.

Also, I don't see where you are describing what you are measuring and how you are measuring it. Please be specific! We can't really help you without knowing the actual details of what you are doing. There are a lot of ways to do this and get misleading results.

Finally, I will also say that an application really has no control over hardware latency. The best an app can do is to lower its IO buffer size, which has a direct effect on latency at the cost of having the IO thread run more often, and to use kAudioDevicePropertyIOCycleUsage, which trades time in the IOProc for lower latency. But there is nothing an app can do to change what the driver is doing.
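
For reference, those two knobs can be set from an app roughly like this. This is a sketch using the AudioObject property API; the device ID is assumed to have been obtained already, and the values 64 and 0.5 are illustrative, not recommendations:

```c
#include <CoreAudio/CoreAudio.h>

/* Sketch: shrink the IO buffer and the IO cycle usage for a device.
 * `device` is assumed to be a valid AudioDeviceID. */
static OSStatus lower_io_latency(AudioDeviceID device)
{
    AudioObjectPropertyAddress addr = {
        kAudioDevicePropertyBufferFrameSize,
        kAudioObjectPropertyScopeGlobal,
        kAudioObjectPropertyElementMaster
    };
    UInt32 frames = 64;   /* smaller IO buffer: the IO thread runs more often */
    OSStatus err = AudioObjectSetPropertyData(device, &addr, 0, NULL,
                                              sizeof(frames), &frames);
    if (err != noErr) return err;

    addr.mSelector = kAudioDevicePropertyIOCycleUsage;
    Float32 usage = 0.5f; /* fraction of the cycle given to the IOProc;
                             trades IOProc time for lower latency */
    return AudioObjectSetPropertyData(device, &addr, 0, NULL,
                                      sizeof(usage), &usage);
}
```

Both properties are per-device, so they would need to be set again if the user switches devices.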


On Sep 3, 2009, at 9:34 PM, Zachary Schneirov wrote:


I'm currently facing the difficult task of achieving low-latency throughput on a class of USB chipset from C-Media (CM108/109/119) whose sample timings CoreAudio apparently cannot consistently track.


Problem: Over time (about 5 minutes), frames grabbed from the input stream become increasingly delayed, often by up to 250 ms. I'm guessing the HAL's IO engine is underestimating the actual sample rate of the device, leaving behind some number of frames during each IO cycle. Audio is also sometimes garbled, perhaps from overestimating the sample rate and under-running the driver's buffer (?), though this is less common.
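
One way to check the underestimated-rate theory is to compare the device's nominal rate against the rate implied by timestamps observed across IO cycles. A minimal sketch of that arithmetic, in plain C with hypothetical sample-time/host-time pairs standing in for real AudioTimeStamps:

```c
/* Estimate the device's effective sample rate from two observations of
 * (sampleTime, hostTimeSeconds) taken at different IO cycles. */
double effective_sample_rate(double sample0, double host0,
                             double sample1, double host1)
{
    return (sample1 - sample0) / (host1 - host0);
}

/* If the HAL consumes frames at `effectiveRate` while the hardware
 * produces them at `nominalRate`, this many frames of input latency
 * accumulate per second of wall time. */
double latency_growth_per_sec(double nominalRate, double effectiveRate)
{
    return nominalRate - effectiveRate;
}
```

With an illustrative drift of 50 frames/s on a nominal 48 kHz device, latency would grow by about 1 ms per second of wall time, reaching 250 ms in roughly four minutes, which is consistent with the behavior described.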

I can observe the effect using a simple HAL IOProc input callback or with any application that does software playthrough (e.g., CAPlayThrough, HALLab's Input window, etc.). On 10.5 and 10.6 I can reset this latency only by unplugging the device or setting the stream format on any section. On 10.4, stopping the app's HAL engine seems necessary.

This chipset is common in USB headsets (especially those for education) and has some desirable qualities (e.g., hardware-playthrough control), so I'm motivated to adapt my application (which involves very-low-latency audio conferencing) to work with it as well as possible.

If knowledgeable CoreAudio people could tell me which of these work-arounds might set me on the right track, or provide better suggestions, I would be extremely obliged:


a) Avoid AudioDeviceAddIOProc() and instead call AudioDeviceRead() from a real-time thread with jitter-buffer semantics, dropping a few frames every now and then
b) Create a user-space HAL driver to manipulate whatever underlying ring buffer is feeding the HAL IO engine
c) Create an AppleUSBAudio plug-in kext to do the same
d) Set the stream format every few minutes to trigger a reset (extremely disruptive when playing or recording)
e) Send commands directly to the chipset with IOKitLib to trigger a reset, aiming for fewer side-effects
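
Option (a) amounts to a jitter buffer that discards frames whenever the backlog grows past a bound. A minimal sketch of such a drop policy, in plain C with a frame count standing in for the real ring buffer:

```c
#include <stddef.h>

/* Given the number of frames currently buffered, return how many to
 * drop (from the front) so the backlog returns to `target` once it
 * exceeds `target + slack`. Dropping trades an occasional audible
 * glitch for bounded input latency. */
size_t frames_to_drop(size_t buffered, size_t target, size_t slack)
{
    return (buffered > target + slack) ? (buffered - target) : 0;
}
```

The slack keeps the policy from dropping on every cycle; tuning target and slack against the observed drift rate would decide how often glitches occur.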


--

Jeff Moore
Core Audio
Apple



_______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list      (email@hidden)
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden




--

Jeff Moore
Core Audio
Apple





  • Follow-Ups:
    • Re: Work-arounds for driver-level input latency (From: Zachary Schneirov <email@hidden>)
  • References:
    • Work-arounds for driver-level input latency (From: Zachary Schneirov <email@hidden>)
    • Re: Work-arounds for driver-level input latency (From: Jeff Moore <email@hidden>)
    • Re: Work-arounds for driver-level input latency (From: Zachary Schneirov <email@hidden>)
