Re: MusicEventIterator questions
- Subject: Re: MusicEventIterator questions
- From: Paul Davis <email@hidden>
- Date: Tue, 19 Mar 2013 10:05:12 -0400
On Tue, Mar 19, 2013 at 7:42 AM, Ross Bencina
<email@hidden> wrote:
Hi Aran,
I agree with Paul that if the idea is to have a single mutable data structure that is accessible from multiple threads that's kind of hard.
But the way I would do it is to have two single-threaded data structures and propagate changes via a FIFO:
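[editor's note: a rough, hypothetical sketch of that two-structure scheme in C++; none of this code is from the original posts, and names like SpscFifo and Edit are made up. The UI thread owns the editable sequence, the audio thread owns its own copy, and edits travel one way through a lock-free single-producer/single-consumer ring buffer.]

    #include <atomic>
    #include <cstddef>
    #include <cstdint>

    // One change to the shared sequence; the real message type would mirror
    // whatever edits the model actually supports.
    struct Edit {
        enum class Op { Add, Remove } op;
        double   timeStamp;   // position of the event in the sequence
        uint32_t noteId;      // identifies the event being added or removed
    };

    // Single-producer/single-consumer ring buffer: the UI thread calls push(),
    // the audio thread calls pop(), and neither side blocks or allocates.
    template <typename T, size_t N>   // N must be a power of two
    class SpscFifo {
    public:
        bool push(const T& item) {                       // non-RT thread only
            size_t w = write_.load(std::memory_order_relaxed);
            size_t r = read_.load(std::memory_order_acquire);
            if (w - r == N) return false;                // full, retry later
            buf_[w & (N - 1)] = item;
            write_.store(w + 1, std::memory_order_release);
            return true;
        }
        bool pop(T& out) {                               // RT thread only
            size_t r = read_.load(std::memory_order_relaxed);
            size_t w = write_.load(std::memory_order_acquire);
            if (r == w) return false;                    // empty
            out = buf_[r & (N - 1)];
            read_.store(r + 1, std::memory_order_release);
            return true;
        }
    private:
        T buf_[N];
        std::atomic<size_t> write_{0};
        std::atomic<size_t> read_{0};
    };

    // Shared between exactly two threads; the audio thread drains this at the
    // top of each render cycle and applies the edits to its private copy.
    SpscFifo<Edit, 1024> gEdits;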
there is an alternate approach too that also uses a FIFO. the "audio" thread (the one running in an RT context, where rendering is done) doesn't need to know about anything that, well, it doesn't need to know about. it only cares about "now", meaning the specific data required to render audio for the current block of time.
so you can leave your fancy-pants data structure in non-RT land, and just make sure that a "linearized" version of it gets pushed into a FIFO that the audio thread can pull from. that way, you can use mutexes or whatever synchronization mechanisms you want in non-RT land, and the audio thread just gets a simplified, accurate, "windowed" snapshot of it to use for rendering.
of course, this gets complex when you jump around on the timeline and have to refill the FIFO correctly, and in this sense, Ross' idea has a different kind of simplicity on its side.
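[editor's note: a similarly hypothetical sketch of that "linearized window" variant, reusing the SpscFifo type from the sketch above; none of the names here come from the original posts. The mutex lives entirely on the non-RT side, and the render callback only ever pops plain, self-contained records.]

    #include <mutex>
    #include <vector>

    // One flattened event record: everything the audio thread needs to render
    // it, with no pointers back into the editable model.
    struct ScheduledNote {
        double sampleTime;   // absolute position in samples
        float  frequency;
        float  amplitude;
    };

    std::mutex                    gModelMutex;  // non-RT side only
    std::vector<ScheduledNote>    gModel;       // stand-in for the fancy structure
    SpscFifo<ScheduledNote, 1024> gNotes;       // SpscFifo as sketched above

    // Non-RT thread: periodically linearize the events that fall inside the
    // upcoming window and hand them to the audio thread.
    void feedAudioThread(double windowStart, double windowEnd) {
        std::lock_guard<std::mutex> lock(gModelMutex);   // fine here, not RT
        for (const ScheduledNote& n : gModel)
            if (n.sampleTime >= windowStart && n.sampleTime < windowEnd)
                gNotes.push(n);                  // handle "full" as appropriate
    }

    // RT thread: consume only the records that belong to the current block,
    // carrying over at most one record that belongs to a later block.
    struct Renderer {
        bool          havePending = false;
        ScheduledNote pending{};

        void renderBlock(double blockEnd /*, audio buffers ... */) {
            ScheduledNote n;
            while (true) {
                if (havePending) { n = pending; havePending = false; }
                else if (!gNotes.pop(n)) break;          // nothing queued
                if (n.sampleTime >= blockEnd) {          // later block's event
                    pending = n;
                    havePending = true;
                    break;
                }
                // n falls inside this block: mix/schedule it into the output here
            }
        }
    };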