Re: MIDI events / note jitter / remoteIO buffer
- Subject: Re: MIDI events / note jitter / remoteIO buffer
- From: Gregory Wieber <email@hidden>
- Date: Wed, 29 Dec 2010 15:10:25 -0800
With the help of some offline comments, I've found a solution to this problem. Instead of sending my MIDI events at the end of each remoteIO render callback, I've created a separate thread that runs at a faster rate than the render cycle, and I've put that thread in charge of sending out the queued MIDI events. It works well as far as I can tell; MIDI notes now land much closer to the grid when viewed in Logic.
For anyone who has been following this issue: it means I'm able to retain a sample-accurate sequencer that works within a remoteIO render callback, without having to wait an entire buffer length to send out MIDI events.
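In case it helps anyone, here's the shape of it (a minimal sketch, not my actual code; the queue size, names, and the 1 ms polling interval are placeholders): the render callback pushes timestamped events into a single-producer/single-consumer ring buffer, and the dedicated thread drains that buffer and calls MIDISend well before the next buffer boundary.

    #include <CoreMIDI/CoreMIDI.h>
    #include <pthread.h>
    #include <stdatomic.h>
    #include <string.h>
    #include <unistd.h>

    #define QUEUE_SIZE 256

    typedef struct {
        MIDITimeStamp timestamp;  // host-time units (mach_absolute_time)
        Byte          data[3];    // e.g. note-on: status, note, velocity
        UInt16        length;
    } QueuedEvent;

    static QueuedEvent gQueue[QUEUE_SIZE];
    static atomic_uint gHead;             // advanced by the dispatch thread
    static atomic_uint gTail;             // advanced by the render callback
    static MIDIPortRef     gOutPort;      // created elsewhere with MIDIOutputPortCreate()
    static MIDIEndpointRef gDestination;  // e.g. from MIDIGetDestination(0)

    // Called from the render callback: enqueue without locking.
    static void enqueueMIDIEvent(MIDITimeStamp ts, const Byte *bytes, UInt16 len)
    {
        unsigned tail = atomic_load(&gTail);
        unsigned next = (tail + 1) % QUEUE_SIZE;
        if (next == atomic_load(&gHead)) return;  // queue full; drop the event
        gQueue[tail].timestamp = ts;
        gQueue[tail].length    = len;
        memcpy(gQueue[tail].data, bytes, len);
        atomic_store(&gTail, next);
    }

    // The dedicated thread: drain the queue every millisecond or so instead
    // of waiting for the end of a render cycle.
    static void *midiDispatchThread(void *unused)
    {
        for (;;) {
            while (atomic_load(&gHead) != atomic_load(&gTail)) {
                unsigned     head = atomic_load(&gHead);
                QueuedEvent *ev   = &gQueue[head];

                Byte            buffer[64];
                MIDIPacketList *pktList = (MIDIPacketList *)buffer;
                MIDIPacket     *pkt     = MIDIPacketListInit(pktList);
                MIDIPacketListAdd(pktList, sizeof(buffer), pkt,
                                  ev->timestamp, ev->length, ev->data);
                MIDISend(gOutPort, gDestination, pktList);

                atomic_store(&gHead, (head + 1) % QUEUE_SIZE);
            }
            usleep(1000);  // poll well under one buffer length
        }
        return NULL;
    }

    // Started once at setup time:
    //   pthread_t thread;
    //   pthread_create(&thread, NULL, midiDispatchThread, NULL);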
On Wed, Dec 29, 2010 at 12:23 PM, Gregory Wieber <email@hidden> wrote:
Hello,
I have accurately calculated the latency of my application -- let's say it's roughly 0.025 seconds. I'm queuing MIDI events during my render callback and then sending those events out. The events have accurate timestamps, which take the latency of my app into account.
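(Roughly what I mean by "taking the latency into account" -- a simplified sketch, where the helper name and the way the latency is passed in are just for illustration: the event's frame offset within the current buffer is converted to host time and added, together with the measured latency, to the callback's mHostTime.)

    #include <mach/mach_time.h>
    #include <CoreAudio/CoreAudioTypes.h>
    #include <CoreMIDI/CoreMIDI.h>

    // Convert a frame offset within the current render buffer into a
    // MIDITimeStamp (host-time units), adding the measured app latency so
    // the note is stamped where it will actually be heard.
    static MIDITimeStamp timestampForFrame(const AudioTimeStamp *inTimeStamp,
                                           UInt32 frameOffset,
                                           double sampleRate,
                                           double latencySeconds)
    {
        static mach_timebase_info_data_t timebase;
        if (timebase.denom == 0) mach_timebase_info(&timebase);

        double   offsetSeconds = frameOffset / sampleRate + latencySeconds;
        uint64_t offsetNanos   = (uint64_t)(offsetSeconds * 1.0e9);
        uint64_t offsetHost    = offsetNanos * timebase.denom / timebase.numer;

        return inTimeStamp->mHostTime + offsetHost;
    }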
The issue that I'm having is that when I record these MIDI events into a program like Logic, I can measure a jitter in the notes that is equal to the latency of the app. That is, a lot of notes are off by .025 seconds.
My understanding is hitting a bit of a wall here, mainly because time is harder for me to reason about once buffer size and latency come into play. My guess is that some of the MIDI events fall too close to the 'edge' of the buffer, and are therefore late because an entire buffer has to be rendered before that MIDI event ever gets sent. That doesn't entirely make sense to me, though: although there's an inherent latency to the buffer system, when one buffer is done playing the next is already rendered (otherwise you'd hear gaps in the audio, e.g. the audible tones my app makes). So why am I getting time gaps in my MIDI?
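(To put the pattern in code -- a simplified sketch with placeholder names. For scale, a 1024-frame buffer at 44.1 kHz is about 23 ms, which is in the same ballpark as the 0.025-second offsets I'm seeing.)

    #include <AudioToolbox/AudioToolbox.h>

    static void sendQueuedMIDIEvents(void);  // placeholder for the per-buffer dispatch

    // Simplified sketch of my current pattern: events are generated per
    // frame inside the render callback, but only sent once the whole
    // buffer is done.
    static OSStatus renderCallback(void *inRefCon,
                                   AudioUnitRenderActionFlags *ioActionFlags,
                                   const AudioTimeStamp *inTimeStamp,
                                   UInt32 inBusNumber,
                                   UInt32 inNumberFrames,
                                   AudioBufferList *ioData)
    {
        for (UInt32 frame = 0; frame < inNumberFrames; frame++) {
            // ... render audio for this frame; if the sequencer fires a
            // note here, append a timestamped event to the queue ...
        }

        // Everything queued above goes out only now, so an event that fell
        // on an early frame can be dispatched up to
        // inNumberFrames / sampleRate seconds after the frame it belongs
        // to -- roughly one buffer length.
        sendQueuedMIDIEvents();

        return noErr;
    }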
Is there a way to avoid this 'note jitter' somehow?
Creating a music sequencer by counting timestamps in render callbacks seemed to be the recommended approach, but now I'm wondering if it would be better to create a separate real-time thread. I'm resistant, partly because I have a lot of code invested in my current approach, and partly because I've read that the thread approach has accuracy issues of its own -- I might already be getting the most accuracy I'm going to get...
best,
Greg