
Playback timing


  • Subject: Playback timing
  • From: Craig Hopson <email@hidden>
  • Date: Mon, 19 Dec 2005 17:37:41 -0700

Our application has some custom generators. At playback time we build a MusicSequence with several custom tracks into which we put MIDIMetaEvents (these could easily be MusicEventUserData, of course). We then want to play the sequence with frame- or sample-accurate timing, meaning that sometimes an event's output must start in the middle of a render buffer. Some of the generators take a bit of time to turn input samples into output samples, and the number of output samples is not always equal to the number of input samples. Because this processing time varies from unit to unit and from event to event, overall latency is also an issue. Further, we need to update the UI during playback so the user knows what is happening; UI updating must take lower priority than playback, of course.
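Concretely, the sequence build looks roughly like this (a minimal sketch, error checking omitted; the meta event type, payload, and beat time are placeholders for our own generator data):

    #include <AudioToolbox/AudioToolbox.h>
    #include <stdlib.h>

    /* Sketch: one custom track holding a sequencer-specific meta event. */
    static MusicSequence BuildSequence(void)
    {
        MusicSequence sequence;
        NewMusicSequence(&sequence);

        MusicTrack track;
        MusicSequenceNewTrack(sequence, &track);

        /* MIDIMetaEvent is variable length: dataLength bytes follow the
           header, with one byte already counted in the struct itself. */
        UInt32 payloadSize = 4;                       /* placeholder payload */
        MIDIMetaEvent *meta =
            (MIDIMetaEvent *)malloc(sizeof(MIDIMetaEvent) + payloadSize - 1);
        meta->metaEventType = 0x7F;                   /* sequencer-specific */
        meta->dataLength    = payloadSize;
        /* ... fill meta->data with the generator's event payload ... */

        MusicTrackNewMetaEvent(track, 1.0, meta);     /* beat 1.0: placeholder */
        free(meta);
        return sequence;
    }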

It seems there are two ways to handle this playback (tell me if I have anything wrong or have missed some critical bit):

1. Use a MusicPlayer (sketched in code after the list)
	- create the AUGraph
	- create the MusicSequence
	- install a callback via MusicSequenceSetUserCallback
		- the callback will dispatch events to the proper generator
		- the callback will notify the UI to update if appropriate
	- create a MusicPlayer and install the sequence
	- pre-roll the MusicPlayer
	- start the MusicPlayer
	- adjust the headphones
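In code, option 1 comes out something like this (a sketch; the user callback fires for MusicEventUserData events, and MyEventCallback plus the dispatch it performs are placeholders for our own machinery):

    #include <AudioToolbox/AudioToolbox.h>

    static void MyEventCallback(void *inClientData,
                                MusicSequence inSequence,
                                MusicTrack inTrack,
                                MusicTimeStamp inEventTime,
                                const MusicEventUserData *inEventData,
                                MusicTimeStamp inStartSliceBeat,
                                MusicTimeStamp inEndSliceBeat)
    {
        /* Dispatch the event to the proper generator, then post a
           lower-priority notification so the UI can update. */
    }

    static OSStatus PlaySequence(AUGraph graph, MusicSequence sequence)
    {
        MusicSequenceSetAUGraph(sequence, graph);
        MusicSequenceSetUserCallback(sequence, MyEventCallback, NULL);

        MusicPlayer player;
        NewMusicPlayer(&player);
        MusicPlayerSetSequence(player, sequence);
        MusicPlayerPreroll(player);
        return MusicPlayerStart(player);
    }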

** How is latency handled in this model?
** How do we support the notion of preroll in our units?
** Do all of the generators need to support some basic AU interface?
** How is inter-buffer timing handled? (An event begins in the middle of one render buffer and ends in another.)
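On that last question, what I imagine inside a render callback is something like this (a fragment, using the standard render callback's inTimeStamp and inNumberFrames; eventSampleTime is a placeholder for whatever mapping we make from the event's MusicTimeStamp to the output sample timeline):

    /* Sketch: sample-accurate placement of an event within one slice. */
    Float64 sliceStart = inTimeStamp->mSampleTime;
    Float64 sliceEnd   = sliceStart + inNumberFrames;

    if (eventSampleTime >= sliceStart && eventSampleTime < sliceEnd) {
        UInt32 offset = (UInt32)(eventSampleTime - sliceStart);
        /* Zero frames [0, offset), start the event's output at frame
           'offset', and carry anything past sliceEnd over into the
           next render cycle. */
    }

Is that roughly the right picture?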


2. Handle playback ourselves (render sketch after the list)
	- create the AUGraph
		- in the graph, install AUs which receive events based on the current MusicSequence time stamp
		- these AUs will have a (threaded) ring buffer (or similar mechanism) for managing their output samples: they hold ring data until the output time stamp matches the time stamp of their current event, giving sample-accurate timing; otherwise they fill the render buffer(s) with zeros
	- create the MusicSequence
	- somehow do a pre-roll to prime the AUs
	- start the MusicPlayer
	- adjust the headphones
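The render side of those AUs might look roughly like this (a sketch: MyGeneratorState and the MyRing* calls stand in for whatever ring buffer mechanism we end up using):

    #include <AudioUnit/AudioUnit.h>

    extern Boolean MyRingHasData(void *ring);      /* placeholder */
    extern float   MyRingReadSample(void *ring);   /* placeholder */

    typedef struct {
        Float64  eventStartSampleTime;  /* event start on output timeline */
        void    *ring;                  /* our ring buffer, whatever it is */
    } MyGeneratorState;

    /* Sketch: hold ring data until the output time stamp reaches the
       event's start; fill with zeros otherwise. */
    static OSStatus MyRenderCallback(void *inRefCon,
                                     AudioUnitRenderActionFlags *ioActionFlags,
                                     const AudioTimeStamp *inTimeStamp,
                                     UInt32 inBusNumber,
                                     UInt32 inNumberFrames,
                                     AudioBufferList *ioData)
    {
        MyGeneratorState *state = (MyGeneratorState *)inRefCon;
        Float64 sliceStart = inTimeStamp->mSampleTime;
        float *out = (float *)ioData->mBuffers[0].mData;

        for (UInt32 i = 0; i < inNumberFrames; ++i) {
            if (sliceStart + i >= state->eventStartSampleTime &&
                MyRingHasData(state->ring))
                out[i] = MyRingReadSample(state->ring);  /* event output */
            else
                out[i] = 0.0f;                           /* silence until due */
        }
        return noErr;
    }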


-Craig


***************************************************
Craig Hopson, Managing Partner
Red Rock Software, Inc.
10 West Broadway, Suite 850
Salt Lake City, UT 84101
office: 801.322.4322 x103
cell: 801.949.3526
http://www.redrocksw.com

