
Re: Questions on mixing .wav files


  • Subject: Re: Questions on mixing .wav files
  • From: Aran Mulholland <email@hidden>
  • Date: Sat, 8 Aug 2009 19:38:28 +1000

Here is some sample code to get you started.

Some of the code is a bit dodgy (memory management up the spout); I wrote it when I'd just begun and haven't been back to fix it.

But have a look: there is code there to play a few wave files and use a mixer.



On Sat, Aug 8, 2009 at 6:07 PM, Jay Bone <email@hidden> wrote:

Hello CoreAudio gurus,

I am trying to mix multiple small .wav files for playback on an iPhone running OS 3.0.

Prior to the 3.0.1 SDK (using 2.2.1) I achieved this quite easily using AudioServicesPlaySystemSound() and an NSThread with sleepForTimeInterval.
But something must have changed with respect to either NSThread sleeping or AudioServicesPlaySystemSound(), as this approach no longer works for me on 3.0.
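
For reference, what I had been doing was roughly along these lines (a rough sketch only; the usleep() call just stands in for the NSThread sleepForTimeInterval: call, and the sound URL is a placeholder):

// Sketch of the old 2.2.1 approach: register the .wav as a system sound and
// fire it from a sleeping loop.
#include <AudioToolbox/AudioToolbox.h>
#include <unistd.h>

static void PlayDrumLoop(CFURLRef drumURL, int hits)
{
    SystemSoundID soundID = 0;
    AudioServicesCreateSystemSoundID(drumURL, &soundID);

    for (int i = 0; i < hits; i++) {
        AudioServicesPlaySystemSound(soundID);  // fire-and-forget playback
        usleep(125000);                         // ~0.125 s between triggers
    }
    AudioServicesDisposeSystemSoundID(soundID);
}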

My suspicion is that the AudioUnit APIs should be used instead, and those are all new to me.
So I want to define an AudioUnit output node of subtype kAudioUnitSubType_RemoteIO,
and then feed it from a kAudioUnitSubType_MultiChannelMixer.
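
In code, what I have in mind is roughly this sketch (error handling omitted; the bus count of 2 is just a placeholder, and the per-file input callbacks would be attached before starting the graph):

// Sketch: an AUGraph with a MultiChannelMixer feeding the RemoteIO output.
#include <AudioToolbox/AudioToolbox.h>

static AUGraph BuildMixerGraph(AUNode *outMixerNode)
{
    AUGraph   graph;
    AUNode    outputNode, mixerNode;
    AudioUnit mixerUnit;

    AudioComponentDescription outputDesc = { 0 };
    outputDesc.componentType         = kAudioUnitType_Output;
    outputDesc.componentSubType      = kAudioUnitSubType_RemoteIO;
    outputDesc.componentManufacturer = kAudioUnitManufacturer_Apple;

    AudioComponentDescription mixerDesc = { 0 };
    mixerDesc.componentType         = kAudioUnitType_Mixer;
    mixerDesc.componentSubType      = kAudioUnitSubType_MultiChannelMixer;
    mixerDesc.componentManufacturer = kAudioUnitManufacturer_Apple;

    NewAUGraph(&graph);
    AUGraphAddNode(graph, &outputDesc, &outputNode);
    AUGraphAddNode(graph, &mixerDesc,  &mixerNode);
    AUGraphOpen(graph);
    AUGraphNodeInfo(graph, mixerNode, NULL, &mixerUnit);

    // One mixer input bus per .wav file (2 here as a placeholder).
    UInt32 busCount = 2;
    AudioUnitSetProperty(mixerUnit, kAudioUnitProperty_ElementCount,
                         kAudioUnitScope_Input, 0, &busCount, sizeof(busCount));

    // Mixer output bus 0 -> RemoteIO input element 0.
    AUGraphConnectNodeInput(graph, mixerNode, 0, outputNode, 0);

    // Input callbacks for each file get attached to the mixer buses next,
    // then AUGraphInitialize() and AUGraphStart() bring the graph up.
    *outMixerNode = mixerNode;
    return graph;
}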

Next would be connecting my .wav files to the kAudioUnitSubType_MultiChannelMixer, but it is not clear to me how this is done.

Is there an AudioUnit which can be used to wrap a .wav file?
I've found this post to this group:
http://lists.apple.com/archives/coreaudio-api/2009/Jul/msg00066.html
which seems to indicate the following can be done:
ExtAudioFileOpenURL(voice) --\
                              +-- AUMixer(kAudioUnitSubType_MultiChannelMixer) --> AUOutput(kAudioUnitSubType_RemoteIO)
ExtAudioFileOpenURL(music) --/


What I don't understand is how the result of ExtAudioFileOpenURL() gets "connected" to the mixer AudioUnit. The docs show the ExtAudioFileOpenURL() interface defined as:
OSStatus ExtAudioFileOpenURL (
   CFURLRef          inURL,
   ExtAudioFileRef   *outExtAudioFile
);
Can I "connect" (via AUGraphConnectNodeInput()) an ExtAudioFileRef to a mixer node? If so, it's not clear to me from the docs how this is done.
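
My guess, from the linked post, is that the ExtAudioFileRef can't be handed to AUGraphConnectNodeInput() directly, and that each mixer input bus instead gets a render callback that pulls frames out of the file with ExtAudioFileRead(). Something like this sketch (all names here are mine, and the file's client data format would need to be set up to match the mixer input beforehand):

// Sketch: one mixer input bus fed from an ExtAudioFileRef by a render callback.
// The file's client format (kExtAudioFileProperty_ClientDataFormat) would have
// to match the mixer input's stream format.
#include <AudioToolbox/AudioToolbox.h>

static OSStatus FileRenderCallback(void                       *inRefCon,
                                   AudioUnitRenderActionFlags *ioActionFlags,
                                   const AudioTimeStamp       *inTimeStamp,
                                   UInt32                      inBusNumber,
                                   UInt32                      inNumberFrames,
                                   AudioBufferList            *ioData)
{
    ExtAudioFileRef file = (ExtAudioFileRef)inRefCon;
    UInt32 frames = inNumberFrames;
    // ExtAudioFileRead converts to the client format and fills ioData;
    // at end of file it simply returns fewer frames.
    return ExtAudioFileRead(file, &frames, ioData);
}

static void AttachFileToMixerBus(AUGraph graph, AUNode mixerNode,
                                 CFURLRef url, UInt32 bus)
{
    ExtAudioFileRef file = NULL;
    ExtAudioFileOpenURL(url, &file);

    AURenderCallbackStruct cb;
    cb.inputProc       = FileRenderCallback;
    cb.inputProcRefCon = file;
    AUGraphSetNodeInputCallback(graph, mixerNode, bus, &cb);
}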

Also, I'd like to mix the same .wav into the output multiple times, at different time offsets/intervals, e.g. a bass drum .wav every 0.125 seconds. I noticed some kind of delay Audio Unit in the docs, but nothing mentioned for iPhone specifically. Is there a delay audio unit that can be used to achieve this?
Can this all be done without implementing my own callbacks to manually fill buffers?
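
In other words, I'm hoping to avoid having to write a frame-counting render callback like this sketch myself (44.1 kHz is assumed, and the timing here would only be buffer-accurate):

// Sketch: re-triggering a short bass drum .wav every 0.125 s by counting
// frames inside a mixer-input render callback.
#include <AudioToolbox/AudioToolbox.h>

#define kSampleRate   44100
#define kPeriodFrames (kSampleRate / 8)   // 0.125 s between hits

typedef struct {
    ExtAudioFileRef file;        // the bass drum file
    SInt64          frameCount;  // frames rendered so far on this bus
} DrumState;

static OSStatus DrumCallback(void                       *inRefCon,
                             AudioUnitRenderActionFlags *ioActionFlags,
                             const AudioTimeStamp       *inTimeStamp,
                             UInt32                      inBusNumber,
                             UInt32                      inNumberFrames,
                             AudioBufferList            *ioData)
{
    DrumState *state = (DrumState *)inRefCon;

    // Rewind the file whenever a period boundary falls inside this buffer.
    if ((state->frameCount % kPeriodFrames) + inNumberFrames >= kPeriodFrames)
        ExtAudioFileSeek(state->file, 0);

    UInt32 frames = inNumberFrames;
    OSStatus err = ExtAudioFileRead(state->file, &frames, ioData);
    state->frameCount += inNumberFrames;
    return err;
}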

Thanks in advance for any tips or help with this.
-J

 _______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list      (email@hidden)
Help/Unsubscribe/Update your Subscription:

This email sent to email@hidden

References:
  • Questions on mixing .wav files (From: Jay Bone <email@hidden>)
