
Re: Audio threads scheduling


  • Subject: Re: Audio threads scheduling
  • From: Stéphane LETZ <email@hidden>
  • Date: Fri, 2 Apr 2004 22:50:14 +0200

On 2 Apr 2004, at 21:24, William Stewart wrote:


On 02/04/2004, at 1:47 AM, Stéphane Letz wrote:
I wouldn't stress too hard over it. MillionMonkeys was originally
written to help stress the system, all the while trying to simulate
the different strategies a real-world audio engine might use for its
threading. One of the lessons we learned from it is that having your
feeder threads be real-time isn't normally a good thing.

But why?
Actually, using fixed priority 63 for the feeder threads does not work so well...
And using real-time 96 for the feeder threads with properly chosen values for the computation parameter works quite well and solves the interleaving problems.

So what are the *technical* issues that prevent using real-time feeder threads?

The problem is that they are essentially the highest priority threads running on the system. So you then start to compete with yourself for doing I/O tasks (like the I/O thread of CoreMIDI as well as Audio Device I/O).

The question really is why doesn't P=63 work? It should. If it doesn't, then we'd like to get bug reports and reproducible scenarios, so we can have these issues addressed. We've had these kinds of complaints in the past, but have never been able to get reproducible examples, so we can't fix the problems. We can't get these issues fixed on vague complaints - they don't listen to us - with good reason, as often the problem is not what you think it is.

Any help here will be appreciated

Bill
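For context, by a "fixed priority 63" feeder thread I mean a thread taken out of the timeshare class with the usual Mach policy calls, roughly as in the sketch below. This is not the exact code from the sample, and the importance value is an offset from the thread's base priority, so the exact number needed to land on 63 is an assumption:

#include <mach/mach.h>
#include <mach/thread_policy.h>

/* Sketch: make the calling thread fixed priority (non-timeshare) and
   raise its precedence. 'importance' is relative to the thread's base
   priority, so the value needed to reach 63 depends on that base. */
static kern_return_t set_fixed_priority(integer_t importance)
{
    thread_act_t thread = mach_thread_self();

    thread_extended_policy_data_t ext = { 0 };   /* timeshare = false */
    kern_return_t err = thread_policy_set(thread, THREAD_EXTENDED_POLICY,
                                          (thread_policy_t)&ext,
                                          THREAD_EXTENDED_POLICY_COUNT);
    if (err != KERN_SUCCESS)
        return err;

    thread_precedence_policy_data_t prec = { importance };
    return thread_policy_set(thread, THREAD_PRECEDENCE_POLICY,
                             (thread_policy_t)&prec,
                             THREAD_PRECEDENCE_POLICY_COUNT);
}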

Basically I'm testing the MillionMonkeys application.

- using the "occurs production in the feeder thread" option in "fixed priority" mode with a small buffer size, like 64 frames, and 40% CPU load, one gets occasional dropouts, for example when launching applications.

- to test real-time thread interleaving, I use two copies of the MillionMonkeys application:

- with the same buffer size settings (64), using the "occurs production in the feeder thread" option in "fixed priority" mode and 40% CPU load for both applications, one gets occasional dropouts, for example when launching applications.

- with different buffer size settings (64 and 512, for example), using the "occurs production in the feeder thread" option in "fixed priority" mode and 40% CPU load for both applications, one gets constant dropouts when the application that uses the larger buffer size (512) is in front, and occasional dropouts when the application that uses the smaller buffer size (64) is in front.

- with the "occurs production in the feeder thread" option in "real time" mode and different buffer size settings (64 and 512, for example), there are dropouts with the way the real-time thread computation parameter is computed in the given code (that is, 15% of the buffer time slice, as defined in setThreadToPriority). If the computation parameter is computed the way it is done in the I/O thread (that is, proportionally larger values for small buffer sizes), things go well (no more dropouts), as explained in the previous mail and as sketched just after this list.
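To make that comparison concrete, here is roughly how I am deriving the time-constraint parameters in the two cases. The flat 15% figure mirrors the sample's setThreadToPriority; the rule that gives small buffers a proportionally larger computation share is my reading of what the HAL's I/O thread does, so the exact fractions below are assumptions:

#include <stdint.h>

/* Sketch: derive time-constraint parameters (in nanoseconds) from the
   buffer size. The io_thread_style branch uses illustrative fractions;
   the other branch is the flat 15% of the buffer duration. */
typedef struct {
    uint64_t period_ns;
    uint64_t computation_ns;
    uint64_t constraint_ns;
} time_constraints_ns;

static time_constraints_ns compute_constraints(uint32_t frames,
                                               double sample_rate,
                                               int io_thread_style)
{
    time_constraints_ns c;
    double buffer_ns = (double)frames / sample_rate * 1e9;  /* one buffer */

    c.period_ns = (uint64_t)buffer_ns;
    if (io_thread_style) {
        /* assumption: smaller buffers get a larger share of their period */
        double share = frames <= 64 ? 0.50 : frames <= 128 ? 0.35 : 0.15;
        c.computation_ns = (uint64_t)(buffer_ns * share);
    } else {
        c.computation_ns = (uint64_t)(buffer_ns * 0.15);    /* flat 15% */
    }
    c.constraint_ns = c.period_ns;       /* deadline at the end of the period */
    return c;
}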


The problem is that they are essentially the highest priority threads running on the system. So you then start to compete with yourself for doing I/O tasks (like the I/O thread of CoreMIDI as well as Audio Device I/O).

Sorry, but I don't understand this way of reasoning!

You're not competing with yourself; you're trying to get several real-time threads with different needs (period, CPU demand, ...) scheduled in a proper way.

There is a set of real-time threads with different periods and CPU needs, and the issue is to give the scheduler the information it needs to interleave their computations so that each thread meets its deadline. The way it is done on Darwin, with a single priority (97) *and* a way to describe the period, computation, and constraint parameters, seems more adequate than on Linux, for example, where one has to play with thread priorities to solve interleaving issues: real-time threads with a smaller period need a higher priority so they can correctly preempt real-time threads with a larger period.
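Concretely, what I mean by "a way to describe the period, computation, and constraint" is the time-constraint policy call below. This is only a sketch, assuming the standard Mach interface; the parameter values would come from the buffer duration as sketched above:

#include <stdint.h>
#include <mach/mach.h>
#include <mach/mach_time.h>
#include <mach/thread_policy.h>

/* Sketch: put the calling thread into the time-constraint class by
   describing its timing needs rather than picking a priority level.
   The policy fields are in host (mach_absolute_time) units. */
static kern_return_t set_time_constraint(uint64_t period_ns,
                                         uint64_t computation_ns,
                                         uint64_t constraint_ns)
{
    mach_timebase_info_data_t tb;
    mach_timebase_info(&tb);                          /* ns <-> host ticks */

    thread_time_constraint_policy_data_t policy;
    policy.period      = (uint32_t)(period_ns      * tb.denom / tb.numer);
    policy.computation = (uint32_t)(computation_ns * tb.denom / tb.numer);
    policy.constraint  = (uint32_t)(constraint_ns  * tb.denom / tb.numer);
    policy.preemptible = 1;

    return thread_policy_set(mach_thread_self(),
                             THREAD_TIME_CONSTRAINT_POLICY,
                             (thread_policy_t)&policy,
                             THREAD_TIME_CONSTRAINT_POLICY_COUNT);
}

On Linux, by contrast, the equivalent tuning is done by hand: give the shorter-period thread the higher static (e.g. SCHED_FIFO) priority.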

But in any case we have 100% of the CPU time, and the issue is to divide this time correctly. Of course some applications, like direct-to-disk systems, can use additional lower-priority threads (for example fixed priority 63). But when threads are required to produce audio at a given rate, as in the MillionMonkeys application, it seems that using a fixed priority 63 thread is not adequate.
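To put numbers on it, assuming 44.1 kHz (my assumption; the tests above do not fix a sample rate) and the 40% load used in the tests, the two buffer sizes translate into roughly the following budgets:

#include <stdio.h>

/* Sketch: what dividing the CPU time between the two test cases looks
   like in time-constraint terms. 44.1 kHz and 40% load are assumptions
   matching the tests described above. */
int main(void)
{
    const double sample_rate = 44100.0;      /* assumption */
    const double load        = 0.40;         /* 40% CPU, as in the tests */
    const int    frames[2]   = { 64, 512 };

    for (int i = 0; i < 2; i++) {
        double period_ms      = frames[i] / sample_rate * 1000.0;
        double computation_ms = period_ms * load;
        printf("%3d frames: period %.2f ms, computation %.2f ms\n",
               frames[i], period_ms, computation_ms);
    }
    /* roughly 0.58 ms of work every 1.45 ms versus 4.64 ms every
       11.61 ms: the scheduler has to slot the short-period thread's
       computation inside the long-period thread's window. */
    return 0;
}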

I think that the questions about thread scheduling come up again and again because it is very hard to get *precise* information about the way things are done in the HAL. For example:

- how is the "computation" parameter computed for proper audio thread interleaving?
- how is the "computation" parameter computed for MIDI threads, and how does this interact with audio threads?
- why is it "forbidden" to suspend a real-time thread?

Having this kind of information would be very helpful.

Thanks

Stéphane Letz
_______________________________________________
coreaudio-api mailing list | email@hidden
Help/Unsubscribe/Archives: http://www.lists.apple.com/mailman/listinfo/coreaudio-api
Do not post admin requests to the list. They will be ignored.


  • Follow-Ups:
    • Re: Audio threads scheduling
      • From: Jeff Moore <email@hidden>
  • References:
    • Re: Audio threads scheduling (From: Stéphane Letz <email@hidden>)
    • Re: Audio threads scheduling (From: William Stewart <email@hidden>)
