Re: clarification on RME suggestions ...
- Subject: Re: clarification on RME suggestions ...
- From: "B.J. Buchalter" <email@hidden>
- Date: Fri, 20 Sep 2002 10:36:21 -0400
Hi Marcus,
> I'm no expert on CoreAudio, but I feel using multiple streams to avoid
> interleaving channels is a very complicated solution.
>
> While reading the CoreAudio headers in Jaguar I stumbled across a new
> constant, kAudioFormatFlagIsNonInterleaved, which seems to allow you to have
> *one single stream* containing for example 18 channels, without having to
> interleave them. Is this correct?
But this is essentially the same as having N streams, except now the buffer
boundaries are implied rather than explicit.

Really -- the multi-mono stream case is EXACTLY like ASIO, and it is definitely
the simplest model and the fastest for sample blitting (due to cache coherency,
etc.). The reason this was not imposed upon the drivers is that it does not
match the underlying DMA buffers well for a lot of hardware. Apple decided that
it was best to have the least amount of work done in the driver (and the
kernel), where it is difficult to debug and expensive to maintain, and instead
exported the management of stream interleaving to the CA clients, who can
simply use Apple's AU technology if they do not want to deal with the stream
buffer management. But for any app that was written to deal with ASIO (most
Pro apps coming from OS 9), the multi-mono model is the simplest port. If you
have already dealt with the stereo stream case, then the multi-channel
interleaved case is a very simple extension.
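For illustration only (this is my sketch, not from the original thread): here
is roughly how an interleaved versus a non-interleaved
AudioStreamBasicDescription might look for an 18-channel Float32 device. The
44.1 kHz rate and the channel count are just placeholder values.

#include <CoreAudio/CoreAudioTypes.h>

/* Interleaved: one buffer, each frame holds all 18 channels back to back. */
AudioStreamBasicDescription interleaved = {
    44100.0,                                   /* mSampleRate (placeholder)  */
    kAudioFormatLinearPCM,                     /* mFormatID                  */
    kAudioFormatFlagIsFloat | kAudioFormatFlagIsPacked
        | kAudioFormatFlagsNativeEndian,       /* mFormatFlags               */
    18 * sizeof(Float32),                      /* mBytesPerPacket            */
    1,                                         /* mFramesPerPacket           */
    18 * sizeof(Float32),                      /* mBytesPerFrame             */
    18,                                        /* mChannelsPerFrame          */
    32,                                        /* mBitsPerChannel            */
    0                                          /* mReserved                  */
};

/* Non-interleaved: same format plus kAudioFormatFlagIsNonInterleaved.  The
   per-frame/per-packet sizes now describe ONE channel, and the
   AudioBufferList carries 18 separate AudioBuffers (one per channel) --
   which is why the buffer boundaries are implied rather than explicit. */
AudioStreamBasicDescription nonInterleaved = {
    44100.0,                                   /* mSampleRate (placeholder)  */
    kAudioFormatLinearPCM,                     /* mFormatID                  */
    kAudioFormatFlagIsFloat | kAudioFormatFlagIsPacked
        | kAudioFormatFlagsNativeEndian
        | kAudioFormatFlagIsNonInterleaved,    /* mFormatFlags               */
    sizeof(Float32),                           /* mBytesPerPacket            */
    1,                                         /* mFramesPerPacket           */
    sizeof(Float32),                           /* mBytesPerFrame             */
    18,                                        /* mChannelsPerFrame          */
    32,                                        /* mBitsPerChannel            */
    0                                          /* mReserved                  */
};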
> Slightly off-topic: This is embarrassing, but I have yet to understand why
> the concept of audio streams exists at all in CoreAudio. To support sound
> cards with channels that use different sample rates simultaneously?! To have
> different types of compression on different audio channels simultaneously?
> Please fill me in if I've missed something.
To represent the underlying DMA buffer structure of the device, so that the
driver does not have to shuffle samples around to match some abstract CA
buffer structure to its underlying buffer structure. This is because
operations in the kernel are expensive and very difficult to debug (compared
to userland implementations of the same thing).
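As a rough sketch of what that looks like from the client side (my example,
assuming a valid AudioDeviceID obtained elsewhere, e.g. the default output
device), you can ask the HAL which streams a device publishes and what format
each one carries:

#include <CoreAudio/AudioHardware.h>
#include <stdio.h>
#include <stdlib.h>

/* Print the output streams a device publishes and each stream's format. */
static void PrintStreamLayout(AudioDeviceID device)
{
    UInt32 size = 0;
    if (AudioDeviceGetPropertyInfo(device, 0, false,
                                   kAudioDevicePropertyStreams,
                                   &size, NULL) != noErr)
        return;

    UInt32 numStreams = size / sizeof(AudioStreamID);
    AudioStreamID *streams = (AudioStreamID *)malloc(size);
    if (AudioDeviceGetProperty(device, 0, false,
                               kAudioDevicePropertyStreams,
                               &size, streams) == noErr) {
        for (UInt32 i = 0; i < numStreams; i++) {
            AudioStreamBasicDescription fmt;
            UInt32 fmtSize = sizeof(fmt);
            if (AudioStreamGetProperty(streams[i], 0,
                                       kAudioDevicePropertyStreamFormat,
                                       &fmtSize, &fmt) == noErr) {
                printf("stream %lu: %lu channel(s), %.0f Hz\n",
                       (unsigned long)i,
                       (unsigned long)fmt.mChannelsPerFrame,
                       fmt.mSampleRate);
            }
        }
    }
    free(streams);
}

Each stream you see here corresponds to one chunk of the device's DMA layout,
which is exactly what the driver hands up without any reshuffling.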
And, as Bill said:
> With Jaguar you will be able to turn streams on and off, and this will
> certainly make a difference for efficiency... So, if you have a device like
> the RME card, that publishes multiple mono streams, but you're only using 8
> of them say, then you will be well advised to turn off the streams that the
> user isn't using in any given context.
>
> This may be some more UI for you (and this property can be adjusted
> dynamically) - but - your app then gives back to the user valuable CPU time
> to do more of what she wants to do...
Since this is a per-stream property, it does not really help if all the
channels are packed into one stream.
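For what it's worth, here is a minimal sketch of one way to do that, as I
understand the Jaguar kAudioDevicePropertyIOProcStreamUsage property. The
device ID, stream count and IOProc are assumed to come from elsewhere in your
app, and "8" just matches the example above:

#include <CoreAudio/AudioHardware.h>
#include <stddef.h>
#include <stdlib.h>

/* Keep only the first 8 output streams enabled for the given IOProc. */
static OSStatus UseFirstEightStreams(AudioDeviceID device,
                                     UInt32 numStreams,
                                     AudioDeviceIOProc myIOProc)
{
    UInt32 size = offsetof(AudioHardwareIOProcStreamUsage, mStreamIsOn)
                  + numStreams * sizeof(UInt32);
    AudioHardwareIOProcStreamUsage *usage =
        (AudioHardwareIOProcStreamUsage *)malloc(size);
    OSStatus err;

    usage->mIOProc = (void *)myIOProc;     /* which IOProc this applies to */
    err = AudioDeviceGetProperty(device, 0, false,
                                 kAudioDevicePropertyIOProcStreamUsage,
                                 &size, usage);
    if (err == noErr) {
        for (UInt32 i = 0; i < usage->mNumberStreams; i++)
            usage->mStreamIsOn[i] = (i < 8);   /* keep 0-7, silence the rest */
        err = AudioDeviceSetProperty(device, NULL, 0, false,
                                     kAudioDevicePropertyIOProcStreamUsage,
                                     size, usage);
    }
    free(usage);
    return err;
}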
Best regards,
B.J. Buchalter
Metric Halo
M/S 601 - Building 8
Castle Point Campus
Castle Point, NY 12511-0601 USA
tel +1 845 831-8600
fax +1 603 250-2451
If you haven't heard ChannelStrip yet, you don't know what you're missing!
Check out SpectraFoo, ChannelStrip and Mobile I/O at
http://www.mhlabs.com/
Download a 12 day demo from <http://www.mhlabs.com/demo/>
_______________________________________________
coreaudio-api mailing list | email@hidden
Help/Unsubscribe/Archives:
http://www.lists.apple.com/mailman/listinfo/coreaudio-api
Do not post admin requests to the list. They will be ignored.