Re: Multiple Stereo streams vs. multi-channel stream
- Subject: Re: Multiple Stereo streams vs. multi-channel stream
- From: Jeff Moore <email@hidden>
- Date: Wed, 30 Apr 2003 16:52:04 -0700
Unless each stream is clocked independently (and nothing you've said
makes me think that this is the case), there is no reason to have
multiple engines. I would implement your device with one engine that
has multiple stereo streams. That seems the most natural way to do
things from my point of view.
There are already devices out there on the market that have unusual
stream layouts. For instance, the MotU 828 has 3 streams in each
direction. The first has 8 channels, the second has 2 and the third has
8. The Metric Halo MobileIO has 18 mono input streams and 8 mono output
streams. There are apps out there that don't handle these devices
well (some crash outright). But this is a problem with the app,
not with the way the devices are presenting themselves to the system,
because it is up to each client of the HAL to handle
whatever stream layout a given device chooses to present.
So, my recommendation is to go with what is natural, as Bill said.
On Wednesday, April 30, 2003, at 04:17 PM, BlazeAudio Developer wrote:
Bill,
Thank you for the detailed response.
The "most natural" way is to represent our device as having 8 Output
Audio Engines and 8 Input Audio Engines, each of which has one
stereo (not mono) stream.
Unfortunately, this does not work well with most of the programs that
we have tested.
Programs will only work with one audio-engine at a time.
That's why we've created 9 engines: 8 engines each having one input
and one output stream (stereo), and a ninth engine that has 8 input
and 8 output streams (also stereo). As I said, this works well with
Logic, but not with Cubase SX!
Thanks.
Devendra.
At 12:10 AM 4/30/2003, Bill Stewart wrote:
The design intention of the HAL in general was to allow the device
to publish itself in the format that is most efficient, and thus
allow the driver to do as little work as possible. This is important
both in terms of the CPU usage incurred when an app interacts with
the driver, and in terms of resource issues like memory usage, where
kernel/wired memory is an expensive resource.
Thus, what is the most natural way of representing your device to an
application? It sounds to me that this is a set of mono streams.
This is actually a very efficient way of publishing the device's
channels. Why? Clients of the device can choose which channels they
are using and can turn off the streams of the device that aren't in
use. We are doing some of this already in Jaguar (though not as much
as we should) with the output units and/or the SoundManager, and we
will have this working completely in the next OS release. This can
make a considerable difference to the CPU usage.
Both the SoundMgr and the output units already deal correctly with
these mono streamed devices. The Mobile I/O and some other drivers
publish channels as mono streams, and I think most of the apps are
working with this device now (there were some teething problems, but
I think all of these have been fixed).
Bill
On Tuesday, April 29, 2003, at 11:05 PM, BlazeAudio Developer wrote:
We have a device whose hardware/DMA is designed for multiple stereo
(interleaved) streams.
The first approach we took with our driver was: create 8 audio
engines, each having one stereo stream.
This works fine, but most programs will only work with one
audio-engine at a time.
So, we took another approach: we created a single audio engine with
8 stereo streams (the hardware supports treating all 8 DMA engines
as one, so we don't really have a sync issue).
This scheme works well with Logic Platinum 5.5.
However, the driver breaks with other software like Cubase SX and
Spark ME.
Is this a known problem?
What's the best approach to solving this?
One alternative is to do double-buffering: create a single
multi-channel stream, and then distribute the data coming from
CoreAudio to the 8 separate DMA buffers. But that doesn't seem like
quite the right thing to do.
Thanks.
Devendra.
_______________________________________________
coreaudio-api mailing list | email@hidden
Help/Unsubscribe/Archives:
http://www.lists.apple.com/mailman/listinfo/coreaudio-api
Do not post admin requests to the list. They will be ignored.
--
mailto:email@hidden
tel: +1 408 974 4056

"Much human ingenuity has gone into finding the ultimate Before.
The current state of knowledge can be summarized thus:
In the beginning, there was nothing, which exploded" - Terry Pratchett
--
Jeff Moore
Core Audio
Apple