

Render Buffer Sizes Usually Consistent?


  • Subject: Render Buffer Sizes Usually Consistent?
  • From: James Chandler Jr <email@hidden>
  • Date: Tue, 10 Feb 2009 13:02:46 -0500

Hi. A fuzzy question:

I have a MIDI/Audio app that is fairly complex, but the playback chain is simple. The app is working fine so far and syncs MIDI and Audio tempos in ordinary situations. Am currently writing fail-safe mechanisms to keep the MIDI tempo from running away from the audio in edge cases.

Have not yet encountered edge cases, but it is too optimistic to expect that they will never happen <g>. Am wondering how elaborate the fail-safe needs to be.

I open a DefaultOutputUnit and install a RenderCallBack. I let the DefaultOutputUnit use whatever render buffer size it wishes to use. Hopefully that will make it friendly with the largest number of Mac models.

The RenderCallBack transmits MIDI to a MusicDevice via MusicDeviceMIDIEvent(), calls AudioUnitRender(), and then generates tempo-stretched multitrack audio. Then the RenderCallBack mixes rendered MIDI with the multitrack audio and returns. The app does not use Apple timestretch, sample converter or mixer AU's.

The MIDI seems to have good relative timing. When the app transmits MIDI, it interpolates the expected time location of each MIDI packet to an estimated sample offset into the 'next buffer that will get rendered', and then places the event at the proper place in the buffer using the inOffsetSampleFrame parameter of MusicDeviceMIDIEvent(). This gives the MIDI one buffer of latency, but good relative timing.

To account for possible edge cases: it looks easy to micro-adjust the MIDI tempo to agree with a possibly off-speed audio hardware sample rate, provided the DefaultOutputUnit 'almost always' requests the same inNumberFrames buffer size in the RenderCallBack (once the DefaultOutputUnit has been started and has settled on whatever buffer size it wants to use).
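The micro-adjustment could be as simple as a ratio of the measured frame rate to the nominal one. A minimal sketch, assuming the app can count frames rendered over a host-clock interval (TempoCorrection is a hypothetical name):

```c
#include <stdint.h>

/* Hypothetical sketch: derive a tempo-correction factor by comparing
 * frames actually rendered against the nominal sample rate over a
 * measured host-clock interval. Multiplying the MIDI tempo by this
 * factor keeps it in step with off-speed audio hardware. */
double TempoCorrection(uint64_t framesRendered, double elapsedSeconds,
                       double nominalSampleRate)
{
    double measuredRate = (double)framesRendered / elapsedSeconds;
    return measuredRate / nominalSampleRate;
}
```

For example, hardware that delivers 44541 frames in one nominal 44100 Hz second is running 1% fast, so the MIDI tempo would be scaled by 1.01. In practice the interval would need to be long (or smoothed) to average out callback jitter.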

However, if it is common for the DefaultOutputUnit to alter its render buffer size 'on the fly', so that different inNumberFrames counts can be requested at unpredictable times, it would introduce a variable MIDI latency situation requiring smarter code. For instance, one render requesting 2000 frames, the next render requesting 500 frames, the next render requesting 3000 frames, whatever. My mechanism would still have good relative MIDI timing, but the latency between MIDI and audio could drift around.
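One way such 'smarter code' might look: rather than assuming a fixed buffer size, keep a running absolute frame cursor, stamp each MIDI event with an absolute frame time, and convert to an in-buffer offset at render time. This is a sketch under those assumptions (FrameCursor, AdvanceCursor, and OffsetInBuffer are hypothetical names, not CoreAudio API):

```c
#include <stdint.h>

/* Hypothetical sketch: a running sample-frame cursor that survives
 * variable render-buffer sizes. Each callback advances the cursor by
 * the requested inNumberFrames; events carry absolute frame times, so
 * latency does not drift when the buffer size varies per callback. */
typedef struct {
    uint64_t nextBufferStartFrame; /* absolute frame of the next buffer's first frame */
} FrameCursor;

/* Call once per render callback with the requested frame count.
 * Returns the absolute frame time of this buffer's first frame. */
uint64_t AdvanceCursor(FrameCursor *c, uint32_t inNumberFrames)
{
    uint64_t start = c->nextBufferStartFrame;
    c->nextBufferStartFrame += inNumberFrames;
    return start;
}

/* Map an event's absolute frame time into this buffer.
 * Returns -1 if the event is not yet due, 0 if it is overdue
 * (play immediately), else the in-buffer offset. */
int64_t OffsetInBuffer(uint64_t bufferStart, uint32_t numFrames,
                       uint64_t eventFrame)
{
    if (eventFrame < bufferStart)
        return 0;                              /* overdue: play now */
    if (eventFrame >= bufferStart + numFrames)
        return -1;                             /* defer to a later buffer */
    return (int64_t)(eventFrame - bufferStart);
}
```

With this scheme a 2000-frame render followed by a 500-frame render simply advances the cursor from 0 to 2000 to 2500; an event stamped at frame 2100 lands 100 frames into the second buffer regardless of how the buffer boundaries fall. (The AudioTimeStamp's mSampleTime passed to the callback could likely serve as this cursor directly.)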

Apologies for such a tedious message. Being naturally lazy, I don't want to over-design the sync mechanism.

Does anyone know if I can count on the DefaultOutputUnit USUALLY invoking the RenderCallBack with the same inNumberFrames render buffer size? Occasional different-sized buffers wouldn't be much of a problem. A 'permanent switch' from one buffer size to another wouldn't be much of a problem. Constantly varying buffer sizes would be a problem.

Thanks!


  • Follow-Ups:
    • Re: Render Buffer Sizes Usually Consistent?
      • From: Brian Willoughby <email@hidden>
    • Re: Render Buffer Sizes Usually Consistent?
      • From: Doug Wyatt <email@hidden>