Re: Conceptual question about a sequencer project
- Subject: Re: Conceptual question about a sequencer project
- From: Kyle Sluder <email@hidden>
- Date: Sun, 29 Aug 2010 16:45:03 -0700
On Sun, Aug 29, 2010 at 2:57 PM, Patrick Muringer <email@hidden> wrote:
> From my experience, I have a buffer of 256 or 512 samples. Thus, the callback
> is called to fill this number of samples. When I change the bpm, it is
> reflected almost immediately (within 512/44100 sec).
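For concreteness, that figure is just one buffer's duration, which is easy to compute (a quick sketch; the function name is mine, not anything from the API):

```c
/* One buffer's duration = frames / sample rate; that's the worst-case
 * delay before a tempo change becomes audible at the output. */
double buffer_latency_seconds(unsigned frames, double sample_rate)
{
    return (double)frames / sample_rate;
}
```

512 frames at 44.1 kHz works out to about 11.6 ms, which is why the change feels immediate.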
In addition to the responsiveness issue, if I recall correctly render
callbacks are executed on a realtime thread with a hard deadline. I
don't know if that deadline varies based on the number of samples
requested, but filling much bigger buffers than 512 samples might
cause you to miss that deadline. Since the audio can only be played at
the sample rate anyway, there's really not much use in queueing up
huge buffers of samples, especially for dynamic audio. (This was
something that took me, also a Core Audio newbie, a while to realize.)
> Yes this is what I have. As said before, the callback is called each time it
> needs samples. Depending on the buffer size you choose, the callback will be
> called more or less often. But it is regular. Regarding the sync with the
> UI, I set a variable in the call back that reflects the current time. In the
> main thread, I use a Timer that checks this variable (20 times/sec) in order
> to sync with audio. There might be better strategies...
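That polling strategy can be sketched like so (a hypothetical example, not the poster's actual code): the render callback publishes the playhead position through a single atomic variable, and the main-thread timer reads it, so the realtime thread never has to take a lock:

```c
#include <stdatomic.h>
#include <stdint.h>

/* The render callback writes the playhead through one atomic; the
 * main-thread timer reads it. No locks ever touch the realtime
 * thread, so it can't block or priority-invert on the UI. */
_Atomic uint64_t g_playhead_frames;

/* Call from the render callback after filling each buffer. */
void publish_playhead(uint64_t frames_played)
{
    atomic_store_explicit(&g_playhead_frames, frames_played,
                          memory_order_relaxed);
}

/* Call from the main-thread timer (e.g. 20 times/sec) to update views. */
double read_playhead_seconds(double sample_rate)
{
    uint64_t f = atomic_load_explicit(&g_playhead_frames,
                                      memory_order_relaxed);
    return (double)f / sample_rate;
}
```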
This sounds like an appropriate strategy. Set an NSTimer on your main
(UI) thread, and use that to push the new values from the playback
engine to the view. Don't try to be "insanely realtime" by forcing a
UI refresh every runloop cycle; all you'll do is pile up drawing work
faster than the monitor's refresh rate can display it, blocking the
main thread in the process and making your app feel sluggish and
unresponsive. A good trick, if
you were doing something similar to a DAW, might be to look at the
time delta that one pixel on your timeline represents, and set your
timer up with that interval.
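In other words (the function name here is hypothetical), the useful lower bound on the timer interval is the time one pixel represents at the current zoom:

```c
/* At a zoom of `pixels_per_second`, refreshing more often than one
 * pixel's worth of time moves nothing on screen, so that duration
 * is a sensible timer interval. */
double timer_interval_for_zoom(double pixels_per_second)
{
    return 1.0 / pixels_per_second;  /* seconds represented by one pixel */
}
```

At 100 px/s, for example, there's no point in firing faster than every 10 ms.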
> By the way, even if this is not audio, for those who already did it, if you
> want to have a kind of matrix representing the ticks/beats/bars on the x axis
> and the tracks/samples on y, matrix in which each "square" represents a tick
> "clickable" by the user to modify the sequence, what would you suggest as
> the best strategy? Buttons? Coordinates calculations in the view to know
> where the user clicked and see to which "square" it corresponds in the
> matrix?
Are you asking about what UI control to use for this? NSMatrix seems
to be right up your alley here.
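If you instead roll your own view and do the coordinate math yourself, the hit test is straightforward. A sketch (the names and the uniform cell size are assumptions, and remember that AppKit views are not flipped by default, so you may need to convert y first):

```c
#include <stdbool.h>

/* Map a click point in view coordinates to a (tick, track) cell of
 * the sequencer grid, assuming uniform cell dimensions. */
typedef struct { int tick; int track; bool valid; } GridHit;

GridHit hit_test(double x, double y, double cell_w, double cell_h,
                 int num_ticks, int num_tracks)
{
    GridHit h;
    h.tick  = (int)(x / cell_w);   /* column along the time axis */
    h.track = (int)(y / cell_h);   /* row along the track axis */
    h.valid = x >= 0.0 && y >= 0.0 &&
              h.tick < num_ticks && h.track < num_tracks;
    return h;
}
```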
--Kyle Sluder
Coreaudio-api mailing list (email@hidden)