Re: RemoteIO unit at beginning and end of an AUGraph
- Subject: Re: RemoteIO unit at beginning and end of an AUGraph
- From: infrequent <email@hidden>
- Date: Mon, 7 Jun 2010 00:40:45 +0100
Hi,
I realise this is a fairly old post, but I'm wondering whether things
have changed with the recent OS update. If I run the
iPhoneMultichannelMixerTest project, the call to
AudioUnitGetProperty(mMixer, kAudioUnitProperty_StreamFormat,
kAudioUnitScope_Input, i, &desc, &size) in
MultiChannelMixerController.mm returns desc.mBitsPerChannel = 32 on my
2G iPod touch _and_ the simulator. Attempting to set this to 16
results in the AUGraphInitialize call failing (with the 'fmt?' error).
I know OS 4 is under NDA but I'd hate to spend hours chasing my tail
only to find that this is the way that things should be.
Alternatively, if this is a bug, I'll need to file a radar. So, what
should it be?
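For reference, the call pattern I'm describing is roughly the following
(a minimal sketch, not the sample code verbatim; the bus index, sample
rate and the exact integer format flags are only illustrative):

#include <AudioToolbox/AudioToolbox.h>

// Query the mixer's reported input format on bus i, then try to switch
// that bus to 16-bit signed integer samples. On my 2G iPod touch and the
// simulator the get returns mBitsPerChannel = 32, and AUGraphInitialize
// fails with 'fmt?' after the set.
static OSStatus SetMixerInputTo16Bit(AudioUnit mMixer, UInt32 i, Float64 sampleRate)
{
    AudioStreamBasicDescription desc = {0};
    UInt32 size = sizeof(desc);

    OSStatus err = AudioUnitGetProperty(mMixer,
                                        kAudioUnitProperty_StreamFormat,
                                        kAudioUnitScope_Input,
                                        i, &desc, &size);
    if (err) return err;

    desc.mFormatID         = kAudioFormatLinearPCM;
    desc.mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
    desc.mBitsPerChannel   = 16;
    desc.mChannelsPerFrame = 2;
    desc.mBytesPerFrame    = desc.mChannelsPerFrame * (desc.mBitsPerChannel / 8);
    desc.mFramesPerPacket  = 1;
    desc.mBytesPerPacket   = desc.mBytesPerFrame;
    desc.mSampleRate       = sampleRate;

    return AudioUnitSetProperty(mMixer,
                                kAudioUnitProperty_StreamFormat,
                                kAudioUnitScope_Input,
                                i, &desc, sizeof(desc));
}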
With this in mind, should it be possible to add an
AUGraphSetNodeInputCallback for the RemoteIO mic input alongside the
callbacks to the various inputs to the mixer without engaging the
converter AU in the graph?
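What I have in mind is roughly this (a sketch only; renderInput is a
placeholder for a per-bus render callback like the one in the sample,
and whether the RemoteIO input element can be fed the same way is the
open question):

#include <AudioToolbox/AudioToolbox.h>

// Placeholder render callback that fills the buffers for one mixer bus.
extern OSStatus renderInput(void                       *inRefCon,
                            AudioUnitRenderActionFlags *ioActionFlags,
                            const AudioTimeStamp       *inTimeStamp,
                            UInt32                      inBusNumber,
                            UInt32                      inNumberFrames,
                            AudioBufferList            *ioData);

// Hang one render callback off each mixer input bus via the graph API,
// as iPhoneMultichannelMixerTest does.
static OSStatus AttachMixerInputCallbacks(AUGraph mGraph, AUNode mixerNode,
                                          UInt32 numBuses, void *refCon)
{
    for (UInt32 bus = 0; bus < numBuses; ++bus) {
        AURenderCallbackStruct cb = { renderInput, refCon };
        OSStatus err = AUGraphSetNodeInputCallback(mGraph, mixerNode, bus, &cb);
        if (err) return err;
    }
    return noErr;
}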
thanks and regards
Pierre
On Mon, Apr 13, 2009 at 7:01 PM, Chris Adamson <email@hidden> wrote:
> The mixer won't take 32-bit input (OSStatus -10868, format not supported,
> when setting the mixer's input format), so I tried adding an AUConverter to
> downsample 32-to-16 on the simulator (presumably a nice future-proof defense
> for future hardware changes).
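(For anyone finding this later: the converter approach Chris describes
would look roughly like the sketch below. It is only an illustration;
the node names, the RemoteIO input element number and the mixer bus are
assumptions, not code from the sample.)

#include <AudioToolbox/AudioToolbox.h>

// Insert an AUConverter node between the RemoteIO unit's input element
// (bus 1, where the mic samples appear) and one input bus of the mixer.
static OSStatus InsertConverterBeforeMixer(AUGraph mGraph, AUNode ioNode,
                                           AUNode mixerNode, UInt32 mixerBus)
{
    AudioComponentDescription convDesc = {
        .componentType         = kAudioUnitType_FormatConverter,
        .componentSubType      = kAudioUnitSubType_AUConverter,
        .componentManufacturer = kAudioUnitManufacturer_Apple,
    };

    AUNode convNode;
    OSStatus err = AUGraphAddNode(mGraph, &convDesc, &convNode);
    if (err) return err;

    // RemoteIO output scope of element 1 -> converter -> mixer input bus.
    err = AUGraphConnectNodeInput(mGraph, ioNode, 1, convNode, 0);
    if (err) return err;
    return AUGraphConnectNodeInput(mGraph, convNode, 0, mixerNode, mixerBus);
}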