
Re: playing audio files separated by specified time intervals


  • Subject: Re: playing audio files separated by specified time intervals
  • From: Brian Willoughby <email@hidden>
  • Date: Mon, 22 Dec 2008 00:22:46 -0800

Whether you are writing an AudioUnit, or simply generating audio in an application and sending it to the default output, you have the option of checking the very accurate timeline provided by CoreAudio. For each buffer requested, CoreAudio provides the time stamp of the first sample in the buffer. If your program jots down a reference point when the user presses Play, or establishes a "zero" time by some other method, then you can decide whether a given buffer should contain one of your audio files. There are plenty of conversion APIs to change the time stamps into whatever time format makes the most sense to you. You will need to keep track of the sample offset within each file, but I think AudioFile or ExtAudioFile would help with that.
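A minimal sketch of what such a render callback might look like, assuming a hypothetical ScheduledFile bookkeeping struct, a gPlayStartSampleTime captured when the user presses Play, and files opened elsewhere with ExtAudioFile (error handling and the actual read omitted):

#include <AudioToolbox/AudioToolbox.h>
#include <string.h>

typedef struct {
    Float64         startSampleTime;   // when this file should begin, relative to "Play"
    SInt64          readOffsetFrames;  // how many frames of this file have been rendered so far
    ExtAudioFileRef file;              // opened elsewhere with ExtAudioFileOpenURL
} ScheduledFile;

static Float64       gPlayStartSampleTime = 0.0;  // jotted down when the user presses Play
static ScheduledFile gFiles[8];                   // filled in by the application
static UInt32        gFileCount = 0;

static OSStatus RenderCallback(void *inRefCon,
                               AudioUnitRenderActionFlags *ioActionFlags,
                               const AudioTimeStamp *inTimeStamp,
                               UInt32 inBusNumber,
                               UInt32 inNumberFrames,
                               AudioBufferList *ioData)
{
    // Sample time of the first frame in this buffer, relative to the "zero" reference.
    Float64 bufferStart = inTimeStamp->mSampleTime - gPlayStartSampleTime;

    // Start from silence; only the regions where a file is scheduled get written.
    for (UInt32 b = 0; b < ioData->mNumberBuffers; b++)
        memset(ioData->mBuffers[b].mData, 0, ioData->mBuffers[b].mDataByteSize);

    for (UInt32 i = 0; i < gFileCount; i++) {
        ScheduledFile *f = &gFiles[i];

        // Does this file begin before the end of this buffer?
        if (f->startSampleTime < bufferStart + inNumberFrames) {
            UInt32 frameOffset = (f->startSampleTime > bufferStart)
                               ? (UInt32)(f->startSampleTime - bufferStart) : 0;
            UInt32 framesToRead = inNumberFrames - frameOffset;
            // Read framesToRead frames from f->file into ioData starting at
            // frameOffset (ExtAudioFileRead advances the read position for you),
            // update f->readOffsetFrames, and stop once the file is exhausted.
        }
    }
    return noErr;
}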

If you don't want any overlap of audio, that's all you need to do.

If you do want to mix audio - which is what you would need to do if any of those audio files overlap - then you could build an AUGraph using one of the mixer AudioUnits, and provide a render callback for each mixer input. They would all share the same timeline, but each audio file would have a different starting point.
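A rough sketch of that graph, assuming a per-bus render callback named InputCallback as in the earlier fragment; stream-format setup and error checking are omitted, and the function and bus-count choices are illustrative only:

#include <AudioToolbox/AudioToolbox.h>
#include <stdint.h>

// Per-bus render callback, e.g. the one sketched above; the refCon carries the bus index.
extern OSStatus InputCallback(void *inRefCon, AudioUnitRenderActionFlags *ioActionFlags,
                              const AudioTimeStamp *inTimeStamp, UInt32 inBusNumber,
                              UInt32 inNumberFrames, AudioBufferList *ioData);

static void BuildMixerGraph(UInt32 numFiles)
{
    AUGraph graph;
    AUNode  mixerNode, outputNode;
    AudioComponentDescription mixerDesc = {
        kAudioUnitType_Mixer, kAudioUnitSubType_MultiChannelMixer,
        kAudioUnitManufacturer_Apple, 0, 0 };
    AudioComponentDescription outputDesc = {
        kAudioUnitType_Output, kAudioUnitSubType_DefaultOutput,
        kAudioUnitManufacturer_Apple, 0, 0 };

    NewAUGraph(&graph);
    AUGraphAddNode(graph, &mixerDesc, &mixerNode);
    AUGraphAddNode(graph, &outputDesc, &outputNode);
    AUGraphOpen(graph);
    AUGraphConnectNodeInput(graph, mixerNode, 0, outputNode, 0);

    AudioUnit mixer;
    AUGraphNodeInfo(graph, mixerNode, NULL, &mixer);

    // One mixer input bus per file; every callback receives the same AudioTimeStamp,
    // so each file simply uses a different scheduled start sample.
    AudioUnitSetProperty(mixer, kAudioUnitProperty_ElementCount,
                         kAudioUnitScope_Input, 0, &numFiles, sizeof(numFiles));
    for (UInt32 bus = 0; bus < numFiles; bus++) {
        AURenderCallbackStruct cb = { InputCallback, (void *)(uintptr_t)bus };
        AUGraphSetNodeInputCallback(graph, mixerNode, bus, &cb);
    }

    AUGraphInitialize(graph);
    AUGraphStart(graph);
}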

Finally, there might be an easier way to do what you want than what I've described above - this is just the first thing that I considered as an option.

Brian Willoughby
Sound Consulting


On Dec 21, 2008, at 21:24, Maissam Barkeshli wrote:
Hi, I'm new to the Core Audio API; I wonder if someone can point me in the right direction here.


I'm trying to create a highly customized metronome, but I'm running into timing/accuracy/stability problems. I have a bunch of different audio files that I would like to play in succession, separated by specified time intervals. What is the best way to do this? Having the program wait using some kind of sleep() function doesn't seem accurate enough: if the computer is even slightly busy, or if the user switches to another application, the timing drifts.


