Re: audio computation and feeder threads
- Subject: Re: audio computation and feeder threads
- From: Andy <email@hidden>
- Date: Sat, 1 Mar 2003 12:08:27 +0000
On Saturday, Mar 1, 2003, at 04:24 Europe/London, Lionel Woog wrote:
Typically, an IOProc will be called with a 512 frames request. ...
Of course that is not a rule, though: 512 frames seems to be a common
default, but the number of frames depends entirely on the stream format
of the device, which could be anything; e.g. the "Built-in audio
controller" can range from 14 to 6144 frames.
Some applications choose to let the user control the frame length of
the IO buffers in order to reduce latency or to optimise throughput. I
would say that alone is a good reason to have a separate feeder thread.
Some DSP algorithms work better when calculated on larger buffers, so
the work would need to be done in advance of pumping it through the
IOProc, especially if the IOProc were only using, say, 32 frames.
It's currently my belief that the IOProc should do /nothing/ other than
copy or pass audio samples from the feeder buffers to the HAL buffers.
One should not waste time in the HAL's high-priority thread.
I believe that you will find Core Audio very capable of keeping itself
fed.
I am trying to glean from all the information which has gone by on
this list
what I need for a general plan for thread design in my app. ...
...
_______________________________________________
coreaudio-api mailing list | email@hidden
Help/Unsubscribe/Archives:
http://www.lists.apple.com/mailman/listinfo/coreaudio-api
Do not post admin requests to the list. They will be ignored.