MIDI / device latency


  • Subject: MIDI / device latency
  • From: Gregory Wieber <email@hidden>
  • Date: Mon, 27 Dec 2010 21:22:10 -0800

Hello,

A while ago, I posted about a sequencer project. I ended up implementing the sequencer by counting samples in a render callback (in an audio unit graph) and processing the next 'step' of the music sequence once a certain number of samples have elapsed.
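
For context, the sample counting looks roughly like this (heavily simplified; kSamplesPerStep and DoNextStep are placeholder names, not my real code):

#include <AudioToolbox/AudioToolbox.h>

enum { kSamplesPerStep = 11025 };            // e.g. an 8th note at 120 BPM, 44.1 kHz
static UInt32 sSamplesUntilNextStep = kSamplesPerStep;

static OSStatus RenderCallback(void                       *inRefCon,
                               AudioUnitRenderActionFlags *ioActionFlags,
                               const AudioTimeStamp       *inTimeStamp,
                               UInt32                      inBusNumber,
                               UInt32                      inNumberFrames,
                               AudioBufferList            *ioData)
{
    // Walk the frames in this buffer; whenever kSamplesPerStep frames have
    // elapsed, fire the next step of the sequence at frame offset i.
    for (UInt32 i = 0; i < inNumberFrames; i++) {
        if (--sSamplesUntilNextStep == 0) {
            DoNextStep(inTimeStamp, i);      // schedule audio / MIDI for this step
            sSamplesUntilNextStep = kSamplesPerStep;
        }
    }
    return noErr;
}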

I'm now attempting to send MIDI events (over WiFi with an iOS device), and have run into an issue with scheduling the events.  It seems that the host time I'm sending with the MIDI event is not far enough in the future.  I think this is because there is a latency that I'm not taking into account.  (The jittering of notes is not as evident when I'm running in the simulator.)

Please note that while WiFi introduces some latency, my app also sends MIDI messages on key-presses, and those are extremely accurate. So I've deduced that the problem must lie in the buffer length (which needs to stay as large as it is for performance reasons), in the latency between when a buffer is rendered and when the audio actually reaches the speakers, or both.

I can't seem to figure out the proper way to calculate latency in an Audio Unit Graph setup (I have multiple multichannel mixers, each with its own sequencer). I thought a possible solution might be to figure out how much latency there is and then add that to my MIDI timestamps to compensate. Does that sound correct? Could anyone point me to how I actually query the latency?
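
My best guess so far is to read the hardware output latency and I/O buffer duration from the audio session, and possibly kAudioUnitProperty_Latency on the mixer/output units on top of that, but I'm not sure that's the right combination:

#include <AudioToolbox/AudioToolbox.h>

// My guess at the total output latency: one I/O buffer plus the hardware
// output latency reported by the audio session. (Assumes the audio session
// has already been initialized; I'm not sure whether kAudioUnitProperty_Latency
// on the RemoteIO / mixer units also needs to be added on top.)
static Float64 GuessOutputLatencySeconds(void)
{
    Float32 hwLatency = 0;
    UInt32  size = sizeof(hwLatency);
    AudioSessionGetProperty(kAudioSessionProperty_CurrentHardwareOutputLatency,
                            &size, &hwLatency);

    Float32 bufferDuration = 0;
    size = sizeof(bufferDuration);
    AudioSessionGetProperty(kAudioSessionProperty_CurrentHardwareIOBufferDuration,
                            &size, &bufferDuration);

    return (Float64)hwLatency + (Float64)bufferDuration;
}

Is that the right combination of properties to add up?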

Currently, I'm offsetting my MIDI events like this (where i is my current frame position in my render callback's buffer):

UInt64 outHostTime = inTimeStamp->mHostTime + (UInt64)(((Float64)i / 44100.0) * hostTimeFreq);  // cast i so the sample-to-seconds division isn't truncated


And I thought the solution might be something like:


UInt64 outHostTime = inTimeStamp->mHostTime + (UInt64)(((Float64)i / 44100.0) * hostTimeFreq) + latency;
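
where latency would be the value from the query above converted into host ticks. Concretely, something like this is what I had in mind (hostTimeFreq is host ticks per second, derived from mach_timebase_info; GuessOutputLatencySeconds is the sketch above):

#include <mach/mach_time.h>

// Host ticks per second, derived from the mach timebase
// (nanoseconds = ticks * numer / denom, so ticks per second = denom / numer * 1e9).
static Float64 HostTicksPerSecond(void)
{
    mach_timebase_info_data_t tinfo;
    mach_timebase_info(&tinfo);
    return ((Float64)tinfo.denom / (Float64)tinfo.numer) * 1.0e9;
}

// For a step at frame offset i in the render callback
// (in practice I'd cache hostTimeFreq and latency outside the callback):
Float64 hostTimeFreq   = HostTicksPerSecond();
Float64 latencySeconds = GuessOutputLatencySeconds();
UInt64  latency        = (UInt64)(latencySeconds * hostTimeFreq);
UInt64  outHostTime    = inTimeStamp->mHostTime
                       + (UInt64)(((Float64)i / 44100.0) * hostTimeFreq)
                       + latency;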



And just in general, has anyone had experience with note jitter in relation to buffer size? That is, more perceptible jitter when the buffer is larger (as it is on the device vs. in the simulator)?

Any help is greatly appreciated.  Thanks, and happy holidays!

best,

Greg


