Re: audio computation and feeder threads
- Subject: Re: audio computation and feeder threads
- From: Kurt Bigler <email@hidden>
- Date: Sun, 02 Mar 2003 16:28:24 -0800
on 3/1/03 7:29 PM, Bill Stewart <email@hidden> wrote:

> Kurt,
>
> On Friday, February 28, 2003, at 08:10 PM, Kurt Bigler wrote:
> [snip]
>> I am trying to decide when it is a worthy thing to use a feeder thread in
>> connection with an IOProc thread. The following thoughts come to mind as I
>> try to put together my ideas on this, and I would appreciate feedback.
>>
>> First of all, the mere fact of outputting synthesized audio does not in
>> itself appear to constitute a reason for having a feeder thread. I am
>> assuming (though maybe I am wrong) that Apple's audio units do not have
>> any multi-thread decoupling/buffering going on in them - particularly
>> audio units that do synthesis from MIDI would be the issue here. Can I
>> assume that the DLS Synth (which I know _absolutely_ nothing about, yet
>> need to use here as an example) does its synthesis right in the IOProc
>> thread? If yes, then can I assume that this is therefore an "ok thing"?
> Just to be clear on usage here (and some more details about the synth
> itself):
>
> The DLS Synth (like, actually, all of our audio units) does all of its
> work when AudioUnitRender is called, and on the thread that AURender is
> called on.
>
> This typically is an I/O proc, but can be any thread - for instance
> when you want to write the data out to a file.
> [snip]
>
> CPULoad
> [snip]
>
> RenderQuality
> [snip]
>
> I think you should consider supporting both of those properties - this
> then allows the user to make decisions about your synth's usage based on
> a particular situation.
I see that my mention of the DLS synth confused things a little - I was just
using it as an example.
In my case, I'm writing a specialized synthesis app, not a host app, and not
an AU. At the moment the only AUs I have need for in the app are output
units. This may change if the AUBufferUnit is implemented.
My app will basically be part of a turn-key solution for digital organs. If
Macintosh running OS X doesn't turn out to be the ultimate platform for
production (which might use specialized hardware), it still makes a great
development environment, and will no doubt be used for early production if
not longer. This means that considerations like being friendly to other
processes on the system are simply not an issue.
Sorry I wasn't clear enough on this to begin with - it just never occurred
to me. However, it is good to understand all these issues, even if I am not
applying them today, and with all the other readers here none of your words
are wasted.
>> So, I can think of several reasons to use a feeder thread (together with
>> appropriate buffering and consequent additional latency) to feed
>> synthesized audio to an IOProc thread:
>> [snip]
> A good (perhaps the only) reason an AU should consider using its own
> threads is if there are processes within the AU that can be
> parallelised AND you have more than one CPU to execute on. Some also
> like to use threads as a way to maintain complex stack states, etc., so
> that's another good reason.
In this case, then, I think your comments apply equally well to a synthesis
application as to an AU. If not, please clarify.
>> [snip]
>>
>> (3) to buffer against irregularities in system performance, such as some
>> "randomness" in the scheduler together with the unpredictable nature of
>> the demands put on the scheduler
> Hmmm... I think that's a bad reason for the AU at least. This really is
> a responsibility of the host and ultimately of the OS. It also implies
> an arbitrary introduction of latency that is generally undesirable.
I think it's just too idealistic to assume that things will never go
bumpety-bump in the scheduler - the scheduler is all about bumpety-bump! I
don't mean that it is malfunctioning in any way. Of course this is no issue
if you aren't pushing the CPU to the limit. But if you are, things might run
smoothly for 5 minutes, and then the scheduler just happens to juggle things
a certain way, and you have a glitch in continuity, i.e. you get an overload
or whatever. That's why I say it is statistical.
Thanks for all your comments.
This email was getting too long for the 8K limit of this list. Most of the
rest related more to the buffering AU idea, though - so I started a separate
thread for that - more soon.
Thanks,
Kurt Bigler
_______________________________________________
coreaudio-api mailing list | email@hidden
Help/Unsubscribe/Archives:
http://www.lists.apple.com/mailman/listinfo/coreaudio-api
Do not post admin requests to the list. They will be ignored.