AudioBuffers and the RenderProc
- Subject: AudioBuffers and the RenderProc
- From: Francois Hamel <email@hidden>
- Date: Wed, 28 Jul 2004 13:53:58 -0400
ok,
I have another batch of questions.
I looked at the OpenAL source code, and it seems they remove the
RenderCallback when they want to stop a sound while it's playing.
Is this OK, or is there something I don't quite grasp? I don't want
to stop all sounds, just one Input Bus on the 3D Mixer.
Also, is there a relation between the number of buffers in the
AudioBufferList passed in to the RenderProc and the number of output
channels? (For example: 2 speakers -> 2 buffers, 5 speakers -> 5 buffers.)
Is it also possible to use interleaved stereo output? It seems the
current OpenAL implementation always de-interleaves the data before
outputting it. Is there a reason behind that?
Frank
On 04-07-27, at 19:01, William Stewart wrote:

> You can't stop an AU like this... you have to provide valid buffers or
> return an err.
>
> You should provide buffers of silence (and set the render flag to
> indicate silence).
>
> You stop by stopping the AU (OutputUnit) that is asking the 3DMixer to
> produce data.
>
> Bill
> On 27/07/2004, at 2:15 PM, Francois Hamel wrote:
>
>> Hi,
>>
>> I know this may sound a little silly but I'm kinda new with CoreAudio
>> and I'm having problems trying to use the 3DMixer.
>>
>> In my RenderCallback I did the same thing they did in the sample,
>> which is setting the AudioBuffers to NULL, like:
>>
>> else
>> {
>>     // there aren't any more packets to read.
>>     // Set the amount of data read (mDataByteSize) to zero
>>     // and return noErr to signal the AudioConverter there are
>>     // no packets left.
>>
>>     ioData->mBuffers[0].mData = NULL;
>>     ioData->mBuffers[0].mDataByteSize = 0;
>>     gIsPlaying = FALSE;
>>     err = noErr;
>> }
>>
>> This is taken directly from the sample (PlayAudioFileLite). The
>> program does work when executed normally, but in debug mode it
>> produces an access violation a little after having stopped on a
>> breakpoint in this "else" block. This is probably related to the fact
>> that the program sleeps a little and re-checks the variable
>> "gIsPlaying" in a loop, and maybe when run normally the renderer
>> never gets to use the AudioBuffer data before time runs out?
>>
>> Anyway, all I want to know is: how can I stop a sound from playing?
>> Do I need to fill my buffer with silence and always provide a valid
>> audio buffer at all times for each Input Bus of the 3DMixer?
>>
>> I have another question as well: do I need to set up one
>> RenderCallback for each Input Bus of the 3D Mixer Unit, or just one
>> for the whole unit? The only 3D Mixer "tutorial" I could find didn't
>> have the information related to actually playing a sound (Technical
>> Note TN2112: Using the 3DMixer Audio Unit).
>>
>> thanks a lot,
>>
>> Frank
> --
> mailto:email@hidden
> tel: +1 408 974 4056
>
> __________________________________________________________________________
> Culture Ship Names:
> Ravished By The Sheer Implausibility Of That Last Statement [GSV]
> I said, I've Got A Big Stick [OU]
> Inappropriate Response [OU]
> Far Over The Borders Of Insanity And Still Accelerating [Eccentric]
> __________________________________________________________________________

Frank
_______________________________________________
coreaudio-api mailing list | email@hidden
Help/Unsubscribe/Archives:
http://www.lists.apple.com/mailman/listinfo/coreaudio-api
Do not post admin requests to the list. They will be ignored.