Re: latency of MusicDeviceMIDIEvent and sampler device on iOS
- Subject: Re: latency of MusicDeviceMIDIEvent and sampler device on iOS
- From: Hamish Moffatt <email@hidden>
- Date: Tue, 18 Mar 2014 11:46:24 +1100
- Organization: Rising Software Australia Pty Ltd
Hi,
On 17/03/14 23:38, Paul Davis wrote:
If you are trying to sync to audio, you should *always* use the audio
clock as the reference, not the system clock, nor any other clock.
Some systems will provide you with a DLL or PLL to link the system and
audio clocks, but in general if you're already handling audio (i.e.
have registered an IO callback with the HAL), then you can just do the
timing directly yourself.
As someone who also has to avoid all higher level apple APIs in order
to facilitate cross platform portability, welcome to the club :)
Right, as a former designer of complex telecommunications hardware, I
know that the fewer clocks you have the better...
We're not trying to sync to audio though, just get MIDI events out
accurately. In my example I'm trying to play a note every 120ms (eighth
notes at 240 beats per minute). Pretty often the audio is ~20ms late.
Looking into it a bit further I suspect this is one whole 1024 sample
buffer (23ms at 44.1kHz). It catches up for the next event.
How would I use the audio clock as reference anyway? Using
AudioUnitAddRenderNotify?
Though now I see that the preferred and current hardware I/O buffer
durations can be inspected and changed through the audio session. If I'm
missing by one sample buffer, it wouldn't hurt so much if that buffer
were a lot shorter.
Hamish
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list (email@hidden)
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden