Re: Synchronizing with the actual USB device audio clock
- Subject: Re: Synchronizing with the actual USB device audio clock
- From: Jeff Moore <email@hidden>
- Date: Tue, 20 Dec 2005 12:03:42 -0800
The way the HAL knows when to call the IOProc is based entirely on
the time stamps the driver provides. I have discussed driver time
stamps and their importance on this list many times. It has also been
the topic of at least two of my talks at WWDC. In fact, it is the
heart of the IO model we use for audio. Anybody working with hardware
at the driver level really needs to understand how this stuff works
rather intimately. You should review as much of the material as you
can find, very thoroughly.
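The relationship above can be sketched with a toy model (hypothetical, not the HAL's actual code): the host collects the (host time, sample time) stamps a driver takes at each ring-buffer wrap, derives the device's true rate from them, and extrapolates when any given sample will hit the hardware. Everything here, including the 44099.5 Hz figure, is an illustrative assumption.

```python
# Hypothetical sketch (not the HAL's actual implementation): a host
# collects (host_time, sample_time) stamps taken by the driver each time
# the hardware wraps its ring buffer, measures the device's real sample
# rate from them, and extrapolates when a given sample crosses the DAC.

def measured_rate(stamps):
    """Estimate samples per host-time unit from the first and last stamps."""
    (h0, s0), (h1, s1) = stamps[0], stamps[-1]
    return (s1 - s0) / (h1 - h0)

def host_time_for_sample(stamps, sample):
    """Extrapolate the host time at which `sample` reaches the hardware."""
    h1, s1 = stamps[-1]
    return h1 + (sample - s1) / measured_rate(stamps)

# Simulated driver timestamps: a nominally 44100 Hz device actually
# running at ~44099.5 Hz, one stamp per 512-sample buffer wrap
# (host clock expressed in seconds for simplicity).
stamps = [(n * 512 / 44099.5, n * 512) for n in range(10)]
print(measured_rate(stamps))             # ~44099.5, not the nominal 44100
print(host_time_for_sample(stamps, 44100))  # slightly later than 1.0 s
```

The point of the sketch is that the host never assumes the nominal rate: it trusts only the driver's time stamps, which is why accurate stamping matters so much.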
To answer your question about USB, the reason why bi-directional USB
Audio Class compliant devices are split into two devices is because
the USB Audio Class defines them that way. The input side and the
output side are specified to be clocked independently of each other.
They can even run at different sample rates. The HAL defines a
"device" to be a single clock and all the data streams synchronized
to it. Thus, USB Audio Class compliant devices have to be split
apart. Obviously, if you have hardware that can make stronger
guarantees about the clock, you can write your own driver that
presents the device fully muxed.
As far as the data having "the average actual USB audio data flow
rate in both direction is exactly the device sample rate" goes, I
find the notion unlikely. For output at least, the device is at the
mercy of the clocking of the USB controller on the CPU that is
sending the data. It is unlikely that this controller is going to be
running at exactly the rate your device considers correct or exactly
what the device sends back to the CPU on the input stream. It'll be
very close, but I think it would be unlikely to have all these
figures be exactly the same.
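As a back-of-the-envelope illustration of why "very close" is not "exactly the same" (all numbers here are made up for the example, not measurements of any real device):

```python
# Illustrative arithmetic (assumed numbers): how quickly a small clock
# mismatch between the USB host controller and the device accumulates.

NOMINAL_RATE = 44100.0   # rate both sides nominally agree on, in Hz
MISMATCH_PPM = 20        # a plausible crystal tolerance, parts per million

# Samples of drift that accumulate every second of streaming.
drift_per_second = NOMINAL_RATE * MISMATCH_PPM / 1_000_000
print(drift_per_second)  # ~0.88 samples per second

# Time until the mismatch amounts to a whole 512-sample buffer.
seconds_to_slip_buffer = 512 / drift_per_second
print(seconds_to_slip_buffer)  # ~580 s, i.e. under ten minutes
```

So even a modest, spec-compliant clock tolerance slips a full buffer in minutes, which is why the two directions cannot be assumed to run at exactly the same average rate.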
It sounds like you are trying to figure out how to approach some
problem. Perhaps you could go over what the problem is, and we can
help point you toward a solution.
On Dec 20, 2005, at 8:05 AM, philippe wicker wrote:
Could anyone explain to me how the HAL synchronizes the period at
which it calls the IOProc to the actual audio clock of the USB
device? To make my question a little bit clearer, let's assume a
theoretical device with only a mono output audio channel at a fixed
(non-adaptive) and exact 44.1 kHz sample rate. This device would
need exactly 44100 samples each second to feed its DAC. How does
the HAL proceed to provide no more and no less than this (average)
number of samples?
If this device also had the capacity to sample analog audio at the
same exact frequency, it should be requested by the USB host to
send an exact average of 44100 samples each second. I've read here
and there that a USB "in/out" device is split into two devices by
the HAL. What about the average actual USB audio data flow rate in
both directions?
In the very short term, I'm preparing to debug the audio part
of the hardware. The first step is to validate the audio streaming
between the host and the USB interface. The basic idea is to
implement a hardware audio loopback, i.e. to copy the audio received
on the OUT isochronous endpoint into the IN isochronous endpoint. My
hardware can do that easily, provided that the average actual USB
audio data flow rate in both directions is exactly the device sample
rate. I think the IN and OUT data flow rates are both locked to the
device audio clock, but I want to be sure...
--
Jeff Moore
Core Audio
Apple
Coreaudio-api mailing list (email@hidden)