Re: Which channels are used for what?
- Subject: Re: Which channels are used for what?
- From: Jeff Moore <email@hidden>
- Date: Tue, 23 May 2006 16:39:06 -0700
On May 23, 2006, at 4:21 PM, Steve Checkoway wrote:
William Stewart wrote:
Yes, both are variable-sized structs.
Good. I'm handling this correctly then.
BTW, the output unit handles all of this stuff for you.
I understand. That's what you said in the previous e-mail. You also
said that the output unit does not provide as much information as
the HAL. I really have three requirements: I need to play audio with
as little latency as possible; I need to survive device format
changes the way iTunes does when the device sample rate is changed
in AMS; and I need to know when "now" is, as accurately as possible,
multiple times per second and at each IOProc callback. Using the
audio unit, it seems I can easily get the first two (assuming the
overhead of the output unit doesn't add much latency). Talking
directly to the HAL, I have working code that handles the first and
third (at least on Tiger; on prior systems there's nothing I can do
to work around the timing bug I sent so much mail to the list about).
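A minimal sketch of the HAL side of that (names are illustrative):
the HAL hands an IOProc the device's notion of "now", plus the time
the output buffer will hit the hardware, on every callback.

#include <CoreAudio/CoreAudio.h>
#include <string.h>

static OSStatus MyIOProc(AudioDeviceID          inDevice,
                         const AudioTimeStamp  *inNow,
                         const AudioBufferList *inInputData,
                         const AudioTimeStamp  *inInputTime,
                         AudioBufferList       *outOutputData,
                         const AudioTimeStamp  *inOutputTime,
                         void                  *inClientData)
{
    // inNow: where the device clock is right now (sample time and host time).
    // inOutputTime: when the data written to outOutputData will actually play.
    // Fill outOutputData here; this sketch just writes silence.
    for (UInt32 i = 0; i < outOutputData->mNumberBuffers; ++i)
        memset(outOutputData->mBuffers[i].mData, 0,
               outOutputData->mBuffers[i].mDataByteSize);
    return noErr;
}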
Is there any documentation that answers the questions I posed? I'm
willing to do the work to handle this correctly. If no such
documentation exists, and if I had any other audio device to test, I
could poke and prod at it myself; but with just the built-in device
in my G5 (I suppose I could try my G4 as well, but I suspect it
behaves roughly the same), there's not a lot to go on. I have two
channels and one stream, and the mapping is clear.
IMHO, you can satisfy all your requirements with AUHAL provided you
don't mind calling AudioDeviceGetCurrentTime() in your render
callback to get the current device time. Plus, if you go with the
output AU, you won't have to worry about being resilient against
format changes or doing any of the other drudge work it takes to be a
proper HAL client.
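Roughly, that would look like the following (a minimal sketch; the
names, the outputUnit variable, and the absence of error handling
are illustrative):

#include <AudioUnit/AudioUnit.h>
#include <CoreAudio/CoreAudio.h>
#include <string.h>

// Hypothetical refCon: holds the AudioDeviceID that AUHAL is bound to.
typedef struct { AudioDeviceID device; } MyRenderContext;

static OSStatus MyRenderCallback(void                       *inRefCon,
                                 AudioUnitRenderActionFlags *ioActionFlags,
                                 const AudioTimeStamp       *inTimeStamp,
                                 UInt32                      inBusNumber,
                                 UInt32                      inNumberFrames,
                                 AudioBufferList            *ioData)
{
    MyRenderContext *ctx = (MyRenderContext *)inRefCon;

    // Ask the HAL where the device clock is "now"; inTimeStamp tells you
    // when the data you render here will be played.
    AudioTimeStamp now;
    AudioDeviceGetCurrentTime(ctx->device, &now);

    // Render into ioData here (silence in this sketch).
    for (UInt32 i = 0; i < ioData->mNumberBuffers; ++i)
        memset(ioData->mBuffers[i].mData, 0, ioData->mBuffers[i].mDataByteSize);
    return noErr;
}

// Setup, after opening AUHAL (outputUnit) and before initializing it.
// The context must outlive the callback (static or heap allocated).
static MyRenderContext gCtx;

static void InstallCallback(AudioUnit outputUnit)
{
    UInt32 size = sizeof(gCtx.device);
    AudioUnitGetProperty(outputUnit, kAudioOutputUnitProperty_CurrentDevice,
                         kAudioUnitScope_Global, 0, &gCtx.device, &size);

    AURenderCallbackStruct cb = { MyRenderCallback, &gCtx };
    AudioUnitSetProperty(outputUnit, kAudioUnitProperty_SetRenderCallback,
                         kAudioUnitScope_Input, 0, &cb, sizeof(cb));
}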
FWIW, many apps that have very serious synchronization requirements
use AUHAL just fine. Examples include QuickTime, Final Cut, and the
DVD Player.
Also, from the sound of things, you haven't really had much of a
chance to test with the various audio devices out there and the wacky
stream layouts they have. Using AUHAL will give you some confidence
that your app will do something approaching the right thing without
you having to code up all the edge and corner cases. In the end, it
_will_ save you time when you don't have to track down that esoteric
device that one of your users has that is causing your app to do
something bad.
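For a taste of that drudge work, here is a rough sketch of walking a
device's output streams by hand (the helper name is made up for
illustration). kAudioDevicePropertyStreamConfiguration returns one
of those variable-sized structs, an AudioBufferList with one buffer
per stream:

#include <CoreAudio/CoreAudio.h>
#include <stdbool.h>
#include <stdio.h>
#include <stdlib.h>

static void PrintOutputStreamLayout(AudioDeviceID device)
{
    UInt32 size = 0;
    Boolean writable = false;
    if (AudioDeviceGetPropertyInfo(device, 0, false /* output section */,
                                   kAudioDevicePropertyStreamConfiguration,
                                   &size, &writable) != noErr)
        return;

    AudioBufferList *config = (AudioBufferList *)malloc(size);
    if (config == NULL)
        return;

    if (AudioDeviceGetProperty(device, 0, false,
                               kAudioDevicePropertyStreamConfiguration,
                               &size, config) == noErr) {
        // Each buffer describes one stream; mNumberChannels is the number
        // of device channels carried by that stream.
        UInt32 channel = 1;
        for (UInt32 i = 0; i < config->mNumberBuffers; ++i) {
            UInt32 n = config->mBuffers[i].mNumberChannels;
            printf("stream %u: device channels %u-%u\n",
                   (unsigned)i, (unsigned)channel, (unsigned)(channel + n - 1));
            channel += n;
        }
    }
    free(config);
}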
This is the right one.
Thank you. When I get home, I'll file a documentation bug on the
other one.
You only need to provide packet descriptions if the format is
variable (that is, either mBytesPerPacket or mFramesPerPacket of the
ASBD is zero). You don't see this for a linear PCM format. Examples
would be MP3 or AAC, or the variable bit rate version of AC-3 (which
is rare). Even compressed formats that are constant bit rate (both
of the fields above are non-zero, that is, each packet has the same
number of bytes representing the same number of frames) do not
require packet descriptions, because they are constant. (For
example, IMA4 is a constant bit rate compressed format.)
Okay. This is good to know. I will file a documentation bug asking
for this to be included as well.
Be aware that it is currently _never_ the case that you will use
packet descriptions with the HAL. The HAL doesn't support them at
all. Rather, when doing IO with a VBR format, the HAL requires that
IO be done one packet at a time which makes packet descriptions
unnecessary.
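To put that rule in code form (the helper and the example values are
only illustrative):

#include <CoreAudio/CoreAudioTypes.h>
#include <stdbool.h>

// A format needs packet descriptions exactly when its packets are not all
// the same size, i.e. when either of these ASBD fields is zero.
static bool FormatNeedsPacketDescriptions(const AudioStreamBasicDescription *asbd)
{
    return asbd->mBytesPerPacket == 0 || asbd->mFramesPerPacket == 0;
}

// Linear PCM (e.g. 2-channel Float32 at 44.1 kHz): mBytesPerPacket = 8,
// mFramesPerPacket = 1 -- constant, so no packet descriptions.
// VBR AAC or MP3: mBytesPerPacket = 0 -- packet descriptions required,
// except with the HAL, which instead does VBR IO one packet at a time.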
--
Jeff Moore
Core Audio
Apple
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list (email@hidden)
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden