Audio session confusion
- Subject: Audio session confusion
- From: Jack Nutting <email@hidden>
- Date: Wed, 02 May 2012 10:40:55 +0200
I'm working on an app that plays multiple synchronized audio loops
using an audio unit graph. My graph currently contains 54 mixer
inputs, rendered via an AURenderCallback (though I'll likely soon
switch this to AUFilePlayer), but they're never all playing at once.
Currently no more than 6 are playing at any time, though I've had 12
or more going without any trouble. The tracks that aren't meant to be
playing at any point in time are disabled using
kMultiChannelMixerParam_Enable. Depending on what the user does in the
app, channels are enabled and disabled on the fly, and it all works
well, sounds great, and seemingly uses almost no CPU. Remarkable!
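For reference, here's roughly how I toggle the channels (a minimal
sketch; mixerUnit comes from my AUGraph setup, and the helper name is
just illustrative):

#include <AudioUnit/AudioUnit.h>
#include <stdio.h>

// Enable or disable one input bus on the multichannel mixer unit.
// An offset of 0 frames makes the change take effect immediately.
static void SetMixerInputEnabled(AudioUnit mixerUnit, UInt32 bus, Boolean on)
{
    OSStatus result = AudioUnitSetParameter(mixerUnit,
                                            kMultiChannelMixerParam_Enable,
                                            kAudioUnitScope_Input,
                                            bus,
                                            on ? 1.0f : 0.0f,
                                            0);
    if (result != noErr)
        printf("Couldn't toggle bus %u: %d\n", (unsigned)bus, (int)result);
}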
So far so good. Now I want to play some one-off samples in response to
user events, and here's where it gets weird. On an iPad, the sound
effects play just fine, but on an iPhone 4S, they are completely
silent. I first tried playing the samples using SimpleAudioEngine from
CocosDenshion, and when that didn't work switched to using
AudioServicesCreateSystemSoundID() + AudioServicesPlaySystemSound(),
but it made no difference. The only way the sound effects would play
on the iPhone was with the ring/mute switch in the "on" (unmuted)
position.
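For what it's worth, the system-sound path boils down to this (a
sketch; the file URL points at a short sample in the bundle, and
System Sound Services only handles files of 30 seconds or less):

#include <AudioToolbox/AudioToolbox.h>

static SystemSoundID gEffectSound = 0;

// Register the sample once, up front.
static void LoadEffect(CFURLRef fileURL)
{
    AudioServicesCreateSystemSoundID(fileURL, &gEffectSound);
}

// Fire it in response to a user event.
static void PlayEffect(void)
{
    AudioServicesPlaySystemSound(gEffectSound);
}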
The audio session's category was set to
AVAudioSessionCategoryPlayback, so I expected both the loops and the
one-offs to make noise regardless of the switch, but no. Even with
headphones plugged in, the
sound effects are silent if the switch is in the mute position. So, I
tried switching the audio session category to
AVAudioSessionCategorySoloAmbient, but that has its share of weirdness
too. Without headphones it now works acceptably (both the loops and
one-offs are silenced together when the switch is muted), but with
headphones plugged in, the mute switch now silences the one-offs, but
not the loops!
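In case the exact calls matter, here's the session setup, sketched
with the C Audio Session API (the AVAudioSession equivalent should
behave the same; ConfigureSession is just an illustrative name):

#include <AudioToolbox/AudioToolbox.h>

static void ConfigureSession(void)
{
    AudioSessionInitialize(NULL, NULL, NULL, NULL);

    // C equivalent of AVAudioSessionCategoryPlayback; swapping in
    // kAudioSessionCategory_SoloAmbientSound reproduces the second
    // set of symptoms described above.
    UInt32 category = kAudioSessionCategory_MediaPlayback;
    AudioSessionSetProperty(kAudioSessionProperty_AudioCategory,
                            sizeof(category), &category);
    AudioSessionSetActive(true);
}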
There is clearly some piece of knowledge I'm missing here. If it makes
any difference, Apple's MixerHostAudio example was the basis for the
audio units portion of this app, and I still have the
audioRouteChangeListenerCallback() function from there fully intact,
but this function doesn't seem to actually *do* much beyond reading
some values, so I don't know if that's relevant at all.
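For completeness, the callback is wired up the way MixerHost does it,
more or less like this:

#include <AudioToolbox/AudioToolbox.h>

// Same signature as the listener in MixerHost; its body mostly just
// reads the route-change reason out of inPropertyValue.
void audioRouteChangeListenerCallback(void *inUserData,
                                      AudioSessionPropertyID inPropertyID,
                                      UInt32 inPropertyValueSize,
                                      const void *inPropertyValue);

// Registration, done once during session setup:
static void RegisterRouteChangeListener(void)
{
    AudioSessionAddPropertyListener(kAudioSessionProperty_AudioRouteChange,
                                    audioRouteChangeListenerCallback,
                                    NULL);  // user data (MixerHost passes self)
}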
Any ideas about what I'm doing wrong here would be greatly appreciated!
--
// Jack Nutting
// email@hidden
// http://nuthole.com
// http://learncocoa.org