Re: Please help with Audio Session and RemoteIO AU on iPhone and iPod touch
- Subject: Re: Please help with Audio Session and RemoteIO AU on iPhone and iPod touch
- From: Inca Rose <email@hidden>
- Date: Sat, 22 Nov 2008 22:35:53 +0200
Thanks for your answers.
Pls see inline with [IR]
Thanks
Inca Rose
On Nov 21, 2008, at 11:13 PM, William Stewart wrote:
On Nov 21, 2008, at 10:13 AM, Inca Rose wrote:
Hi,
First of all, sorry for the long post, but I've been stuck for a few days now.
I'm in charge of writing the VoIP client part of a game for iPod touch and iPhone.
First problem:
the CurrentHardwareSampleRate is not always set to 8000; sometimes it is, and sometimes it remains at 44100.
We had some bugs here, fixed in 2.2. But sometimes you won't get 8000, so you have to deal with that.
[IR] Can you please specify the fixes? Should I expect to get the preferred sample rate?
On an iPhone without a headset it is always 8000, but on an iPod touch it is sometimes 8000 and sometimes 44100.
yes
You should certainly do what you can to set the hardware up the way
you want it. But if you are also doing a game, you probably don't
want all of the audio in the game to be output at 8KHz
AURIO will do sample rate conversions on both sides though... So,
even if you can't get the hardware at the right sample rate for your
input, you should be able to just tell it that *you* want to see
8KHz and it will do the SRC for you:
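To make Bill's point about the SRC concrete: AURemoteIO can hand you 8 kHz data even when the hardware is running at 44100, because it inserts a sample rate converter between the hardware format and the client format you ask for. The sketch below is a naive linear-interpolation resampler of my own, just to illustrate the arithmetic; it is not what Apple's converter does internally (that one is much higher quality):

```c
#include <stddef.h>

/* Naive linear-interpolation resampler: converts in_count samples at
 * in_rate into samples at out_rate. Illustration only; AURemoteIO's
 * built-in SRC does this for you with a proper filter. */
static size_t resample_linear(const float *in, size_t in_count,
                              double in_rate, double out_rate,
                              float *out, size_t out_max)
{
    double step = in_rate / out_rate;   /* e.g. 44100/8000 = 5.5125 */
    double pos = 0.0;
    size_t n = 0;
    while (n < out_max && (size_t)pos + 1 < in_count) {
        size_t i = (size_t)pos;
        double frac = pos - (double)i;
        out[n++] = (float)((1.0 - frac) * in[i] + frac * in[i + 1]);
        pos += step;
    }
    return n;   /* number of output samples produced */
}
```

So 4410 input samples at 44100 Hz yield roughly 800 output samples at 8000 Hz, which is why asking the unit for 8 kHz works even when the hardware refuses to switch.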
Second problem:
the CurrentHardwareIOBufferDuration gets different values every run; sometimes it is very close to 0.002, sometimes 0.023..., sometimes 0.12... or 0.018. This makes it impossible to get samples from the mic or send samples to the speaker in a predictable manner.
The hardware buffers that we give you should be the same duration, same number of samples each time. There might be a bit of fluctuation due to buffering if there is an SRC in the path though. But I think you are not thinking about this in the right way; even with these fluctuations you are able to get continuous audio.
[IR] I'm experiencing different values each time I run the application. The value is in seconds, so the number of samples and bytes returned in the callback depends on the format.
I'm getting values from 0.0023 to 0.12. The continuity of the audio is OK. The problem is that I need to get samples at fixed intervals to have a smooth VoIP session.
For example, I need to get samples every 10 ms or 20 ms. I can deal with samples every 2 ms and buffer them, but I cannot work with samples every 100 ms, because then the VoIP conversation will have a 0.1 second delay, and in some environments this won't be acceptable.
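One way around the varying callback sizes is to decouple packetization from the hardware: whatever number of samples the input callback delivers, append them to a FIFO and peel off fixed 80-sample packets (10 ms at 8 kHz) for the network. A minimal sketch in plain C; the type and function names here are my own, not CoreAudio's:

```c
#include <string.h>
#include <stddef.h>

#define PACKET_SAMPLES 80          /* 10 ms at 8 kHz */
#define FIFO_CAPACITY  4096

typedef struct {
    short  buf[FIFO_CAPACITY];
    size_t count;                  /* samples currently buffered */
} SampleFifo;

/* Called from the input callback with however many samples the
 * hardware delivered this time (the count may vary run to run). */
static void fifo_push(SampleFifo *f, const short *in, size_t n)
{
    if (f->count + n > FIFO_CAPACITY)      /* drop excess on overflow */
        n = FIFO_CAPACITY - f->count;
    memcpy(f->buf + f->count, in, n * sizeof(short));
    f->count += n;
}

/* Pops one fixed 10 ms packet if enough samples have accumulated.
 * Returns 1 and fills out[PACKET_SAMPLES], or 0 if not ready yet. */
static int fifo_pop_packet(SampleFifo *f, short out[PACKET_SAMPLES])
{
    if (f->count < PACKET_SAMPLES) return 0;
    memcpy(out, f->buf, PACKET_SAMPLES * sizeof(short));
    memmove(f->buf, f->buf + PACKET_SAMPLES,
            (f->count - PACKET_SAMPLES) * sizeof(short));
    f->count -= PACKET_SAMPLES;
    return 1;
}
```

In a real render thread you would use a lock-free ring buffer with read/write indices rather than memmove, but the packetizing idea is the same: the network side always sees exact 10 ms packets regardless of what buffer duration the hardware chose.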
Third problem is with the interruption callback:
I get the callback, for example, when I press the mic button (on the headset) on an iPod touch and music starts to play. My app is still running, but when I press the mic button again the music stops playing and the callback is not called again, so I cannot continue with the VoIP session because my AudioSession never becomes active again.
Why would the iPod touch be interrupting you - if you want iPod to
play in the background, then you need a category that allows that -
we don't have one for doing input and iPod at the same time. So,
this is going to be mutually exclusive I think (but I am not
entirely clear on what you are trying to do here).
[IR] For example, if you are in the middle of a VoIP call, using an iPod touch with the headset-with-mic, and press the mic button, the iPod app will start playing in the background (not mixing with my audio, just taking over the active Audio Session).
My app will receive a call to the interruption callback to signal that it has been interrupted. When the mic button is pressed again, the iPod app stops playing music, but the interruption callback in my app is not called again. So I cannot activate my Audio Session again.
Is this normal behavior, or is something wrong with the interruption callback mechanism?
It is also the case with the aurioTouch example (it sometimes crashes when the mic button is pressed).
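A common workaround for the missing end-interruption notification is to track the interrupted state yourself and re-activate the session from a code path you control (for example when the application becomes active again), rather than relying only on the interruption listener. A self-contained sketch; AudioSessionSetActive is stubbed here so the example stands alone, and on the device you would of course call the real AudioToolbox function and restart your RemoteIO unit as well:

```c
/* Interruption bookkeeping sketch. The _stub function stands in for
 * the real AudioSessionSetActive() call, which also returns an OSStatus. */
typedef struct {
    int interrupted;       /* set when the begin-interruption callback fires */
    int session_active;
} VoipAudioState;

static int AudioSessionSetActive_stub(VoipAudioState *s, int active)
{
    s->session_active = active;   /* real code: AudioSessionSetActive(active) */
    return 0;
}

/* Call from your interruption listener on kAudioSessionBeginInterruption. */
static void on_begin_interruption(VoipAudioState *s)
{
    s->interrupted = 1;
    s->session_active = 0;        /* the system deactivated our session */
}

/* Call from kAudioSessionEndInterruption AND from a place you control
 * (e.g. applicationDidBecomeActive:), since the end notification may
 * never arrive after the iPod app takes over. */
static void resume_if_interrupted(VoipAudioState *s)
{
    if (s->interrupted) {
        AudioSessionSetActive_stub(s, 1);
        s->interrupted = 0;
    }
}
```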
format.mSampleRate       = 8000.0;
format.mFormatID         = kAudioFormatULaw;
format.mFormatFlags      = 0;
format.mBitsPerChannel   = 8;
format.mChannelsPerFrame = 1;
format.mBytesPerFrame    = 1;
format.mFramesPerPacket  = 1;
format.mBytesPerPacket   = 1;
format.mReserved         = 0;
you can't do this - the remote IO should be set to give you PCM, not uLaw. If you want uLaw you will need to use an audio converter.
[IR] What do you mean by "you can't do this"? I'm doing this and getting uLaw samples at the input callback. Maybe it is not the best way to do the conversion?
I'm transmitting the samples to the network and can hear the voice in another SIP client configured with uLaw. So the conversion is done by the AURIO or some internal audio conversion.
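For what it's worth, the conversion in question is plain G.711 µ-law companding, so if you follow Bill's advice and take 16-bit PCM from the RemoteIO unit, encoding for the SIP leg is cheap to do yourself (or via an AudioConverter). A self-contained encoder sketch, the standard G.711 algorithm rather than Apple code:

```c
#include <stdint.h>

/* G.711 mu-law encoder for one sample. This is the companding an
 * AudioConverter performs when going from 16-bit linear PCM to
 * kAudioFormatULaw; shown here for reference only. */
static uint8_t ulaw_encode(int16_t sample)
{
    const int BIAS = 0x84;          /* 132, standard G.711 bias */
    const int CLIP = 32635;
    int v = sample;
    int sign = 0;
    if (v < 0) { sign = 0x80; v = -v; }
    if (v > CLIP) v = CLIP;
    v += BIAS;
    int exponent = 7;               /* find the segment number */
    for (int mask = 0x4000; (v & mask) == 0 && exponent > 0; mask >>= 1)
        exponent--;
    int mantissa = (v >> (exponent + 3)) & 0x0F;
    return (uint8_t)~(sign | (exponent << 4) | mantissa);
}
```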
err = AudioUnitSetProperty(rioUnit, kAudioUnitProperty_StreamFormat,
                           kAudioUnitScope_Output, 1, &format,
                           sizeof(AudioStreamBasicDescription));
err = AudioUnitSetProperty(rioUnit, kAudioUnitProperty_StreamFormat,
                           kAudioUnitScope_Input, 0, &format,
                           sizeof(AudioStreamBasicDescription));

AURenderCallbackStruct renderCallback_struct;
renderCallback_struct.inputProc = callbackRender;
renderCallback_struct.inputProcRefCon = recorder;
err = AudioUnitSetProperty(rioUnit, kAudioUnitProperty_SetRenderCallback,
                           kAudioUnitScope_Input, 0, &renderCallback_struct,
                           sizeof(renderCallback_struct));

AURenderCallbackStruct inputCallback_struct;
inputCallback_struct.inputProc = callbackInput;
inputCallback_struct.inputProcRefCon = recorder;
err = AudioUnitSetProperty(rioUnit, kAudioOutputUnitProperty_SetInputCallback,
                           kAudioUnitScope_Output, 0, &inputCallback_struct,
                           sizeof(inputCallback_struct));

err = AudioUnitInitialize(rioUnit);
On the InputCallback I get the samples from the mic to send to the network. I was expecting to get 16 bytes of uLaw each time so I can work with 10 ms (80 bytes of uLaw) and 20 ms (160 bytes of uLaw). But as I said before, I only get 16 bytes sometimes, and only on the iPhone.
With the same code I also get 64 bytes, 183 bytes, and even 1024 bytes.
ah - well this basically won't work - you need to use an audio converter for this, and this is part of your problem I think.
[IR] What won't work? Is there something wrong with the configuration in my code, or is it just the audio format I use (uLaw) that is wrong?
I just want to be able to get samples at the input (render) callback at fixed intervals so I can transmit those samples to the network; my app will take care of any jitter.
Another question:
I'm now using the InputCallback to get samples from the mic and the renderCallback to pass samples to the speaker.
Can I use the same callback to do both? For example, using the renderCallback to read samples from the mic and at the same time feed the AudioBufferList with samples from the net to be played?
Thanks
Inca Rose
Bill
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list (email@hidden)
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden