Re: Interapplication sync protocol?


  • Subject: Re: Interapplication sync protocol?
  • From: Doug Wyatt <email@hidden>
  • Date: Tue, 8 Oct 2002 10:47:53 -0700

On Tuesday, Oct 8, 2002, at 06:34 US/Pacific, Drayon Detroit wrote:
> OMS provided the IAC bus/driver for sync between audio apps; it was
> generally sloppy and painful to use.
>
> I'm wondering if OS X's Core Audio provides a similar but much tighter
> protocol for sync between audio apps. If so, what is it called, and
> where is the information?

Thanks for asking; this is a good thing to work out now.

It would be easy to write a driver modeled on the OMS IAC driver -- it just created source/destination pairs and looped them back. I think I wrote the OMS driver in about 4 hours, including the UI in OMS Setup.
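For concreteness, here's a minimal sketch of that loopback idea against the CoreMIDI C API. The client and endpoint names are invented for illustration, and error handling is omitted:

    /* Sketch of an IAC-style loopback pair using the CoreMIDI C API. */
    #include <CoreFoundation/CoreFoundation.h>
    #include <CoreMIDI/CoreMIDI.h>

    static MIDIEndpointRef gLoopbackSource;

    /* Anything sent to the virtual destination is re-emitted,
     * timestamps intact, from the paired virtual source. */
    static void LoopbackReadProc(const MIDIPacketList *pktlist,
                                 void *readProcRefCon, void *srcConnRefCon)
    {
        MIDIReceived(gLoopbackSource, pktlist);
    }

    int main(void)
    {
        MIDIClientRef client;
        MIDIEndpointRef dest;

        MIDIClientCreate(CFSTR("IAC"), NULL, NULL, &client);
        MIDISourceCreate(client, CFSTR("IAC Bus 1"), &gLoopbackSource);
        MIDIDestinationCreate(client, CFSTR("IAC Bus 1"),
                              LoopbackReadProc, NULL, &dest);

        CFRunLoopRun();  /* keep the process alive to service the bus */
        return 0;
    }

The read proc just re-emits whatever arrives at the virtual destination, which is essentially all the OMS IAC driver did.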

In OMS we had virtual sources and destinations before there was an IAC driver, and something (this was 10 years ago and I don't remember what it was) drove us to create the IAC driver. Here we are again. This is a good opportunity to step back and design something new -- or decide again that an IAC driver is most flexible.

The fact that CoreMIDI has timestamping makes all forms of IAC much simpler. A timing source (master) could wake just a few times a second and spit out a stream of timestamped MTC or MIDI clock messages covering a small slice of time in the future. Slaves would receive each batch of messages all at once and use them, along with start/stop, to build a timeline.
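As a sketch of what one of the master's wakeups might look like -- the function name, tempo parameter, and buffer size are assumptions, and a real master would call this from a periodic timer:

    /* Sketch: emit all MIDI clocks (0xF8) that fall in the slice
     * [sliceStartNanos, sliceStartNanos + sliceNanos), timestamped so
     * receivers can schedule against their own timelines. Assumes
     * CoreAudio's HostTime utilities for the time conversion. */
    #include <CoreAudio/HostTime.h>
    #include <CoreMIDI/CoreMIDI.h>

    static void EmitClockSlice(MIDIEndpointRef source, UInt64 sliceStartNanos,
                               UInt64 sliceNanos, double bpm)
    {
        Byte buffer[1024];
        MIDIPacketList *pktlist = (MIDIPacketList *)buffer;
        MIDIPacket *pkt = MIDIPacketListInit(pktlist);
        const Byte clockByte = 0xF8;  /* MIDI timing clock */
        UInt64 nanosPerClock = (UInt64)(60.0e9 / (bpm * 24.0)); /* 24/quarter */
        UInt64 t;

        for (t = sliceStartNanos; t < sliceStartNanos + sliceNanos;
             t += nanosPerClock)
            pkt = MIDIPacketListAdd(pktlist, sizeof(buffer), pkt,
                                    AudioConvertNanosToHostTime(t),
                                    1, &clockByte);

        MIDIReceived(source, pktlist); /* transmit from the virtual source */
    }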

Leaving aside for a moment whether this is done with virtual sources or an IAC bus ...

Would developers rather use a new set of APIs for communicating synchronization without MIDI messages? It would take time to develop the new APIs and for applications to implement support for them; it might not happen until the next major OS release.

Or, does CoreMIDI's timestamping make continuing to use MIDI as the synchronization protocol feasible? It does allow developers to build on the synchronization code they already have instead of having to do something new.

We could agree on and document conventions for using only periodic MTC full messages, instead of quarter-frame messages, in order to thin the data stream. The receiver is going to be interpolating time in either case, and with timestamped full messages there's probably no need for a timestamp every quarter frame; perhaps a full message every 3-5 frames (3 frames: 100-125 ms; 5 frames: 167-208 ms, depending on SMPTE format) would suffice.
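A timestamped full message is just a ten-byte sysex (F0 7F 7F 01 01 hh mm ss ff F7, with the SMPTE rate code in the top bits of the hours byte), so the sending side could be as simple as this sketch; the function name is invented for illustration:

    /* Sketch: send one timestamped MTC full-frame message from a
     * virtual source. rateCode: 0=24, 1=25, 2=29.97 drop, 3=30 fps. */
    #include <CoreMIDI/CoreMIDI.h>

    static void SendFullFrame(MIDIEndpointRef source, MIDITimeStamp when,
                              Byte rateCode, Byte hh, Byte mm, Byte ss, Byte ff)
    {
        Byte sysex[] = { 0xF0, 0x7F, 0x7F, 0x01, 0x01,
                         (Byte)((rateCode << 5) | (hh & 0x1F)),
                         mm, ss, ff, 0xF7 };
        Byte buffer[128];
        MIDIPacketList *pktlist = (MIDIPacketList *)buffer;
        MIDIPacket *pkt = MIDIPacketListInit(pktlist);

        MIDIPacketListAdd(pktlist, sizeof(buffer), pkt, when,
                          sizeof(sysex), sysex);
        MIDIReceived(source, pktlist);
    }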

And then the other question is: Would it be feasible to ask developers of apps that can function as masters to publish virtual sources (slaves would then sync to the virtual sources), or is the IAC bus a preferable model?
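For what it's worth, the slave side of the virtual-source model could be as simple as the following sketch; connecting to every source, rather than matching the master by name, is purely for illustration:

    /* Sketch: a slave connects an input port to published sources and
     * reads their timestamped sync messages. */
    #include <CoreFoundation/CoreFoundation.h>
    #include <CoreMIDI/CoreMIDI.h>

    static void SyncReadProc(const MIDIPacketList *pktlist,
                             void *readProcRefCon, void *srcConnRefCon)
    {
        const MIDIPacket *pkt = &pktlist->packet[0];
        UInt32 i;
        for (i = 0; i < pktlist->numPackets; ++i) {
            /* build/refine the timeline from pkt->timeStamp plus the
             * SMPTE or clock position carried in pkt->data */
            pkt = MIDIPacketNext(pkt);
        }
    }

    int main(void)
    {
        MIDIClientRef client;
        MIDIPortRef inPort;
        ItemCount i, n;

        MIDIClientCreate(CFSTR("Sync Slave"), NULL, NULL, &client);
        MIDIInputPortCreate(client, CFSTR("in"), SyncReadProc, NULL, &inPort);

        n = MIDIGetNumberOfSources();
        for (i = 0; i < n; ++i)
            MIDIPortConnectSource(inPort, MIDIGetSource(i), NULL);

        CFRunLoopRun();
        return 0;
    }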

Doug

--
Doug Wyatt
work: email@hidden (CoreAudio)
personal: email@hidden http://www.sonosphere.com

References:
  • Interapplication sync protocol? (From: "Drayon Detroit" <email@hidden>)
