Re: Q: Setting the size of an AudioUnit buffer?
- Subject: Re: Q: Setting the size of an AudioUnit buffer?
- From: "A. Le Flanchec" <email@hidden>
- Date: Fri, 06 Feb 2004 08:34:29 +0000
Hello,
I have a problem with that too. I tried setting the MaximumFramesPerSlice
property (up to 65536), but it did not change ioNumberDataPackets (in the
PlayAudioFileLite example, as far as I remember), which is still 512.
Maybe I'm doing something wrong... I used kAudioUnitScope_Global; is that
right?
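For reference, here is a minimal sketch (untested, and assuming 'unit' is an AudioUnit that has been opened but not yet initialized) of how I set the property. Global scope with element 0 is what the headers suggest for this property, and it must be set before AudioUnitInitialize() to take effect:

```c
#include <AudioUnit/AudioUnit.h>

/* Sketch: raise the maximum number of frames the unit will ask for
   in a single render slice. Call before AudioUnitInitialize(). */
OSStatus SetMaxFramesPerSlice(AudioUnit unit, UInt32 maxFrames)
{
    /* Global scope, element 0 is correct for this property. */
    return AudioUnitSetProperty(unit,
                                kAudioUnitProperty_MaximumFramesPerSlice,
                                kAudioUnitScope_Global,
                                0,
                                &maxFrames,
                                sizeof(maxFrames));
}
```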
By the way, I am completely new to making audio apps. I'm stuck in the
middle of a VERY steep learning curve, if you see what I mean... I think
that I should use different threads for the audio rendering and the GUI.
Maybe the dropouts we hear while playing a file with 'PlayAudioFileLite'
are also related to the fact that there are no separate threads? What do
you think? Is it necessary to use separate threads with different
priorities to make a rock-solid audio app? And different threads for the
MIDI and the audio side?
OK, these are very basic questions... but I would really appreciate some
help before I completely despair!
Vince
Subject: Re: Q: Setting the size of an AudioUnit buffer?
To: Lance Drake <email@hidden>
From: Marc Poirier <email@hidden>
Date: Sun, 1 Feb 2004 14:24:44 -0600 (CST)
You want to set the kAudioUnitProperty_MaximumFramesPerSlice property
(sometimes shortened to MaxFramesPerSlice, but whatever it's called,
that's the one).
Marc
On Fri, 30 Jan 2004, Lance Drake wrote:
Hi Core-Audio folks,
After poring through the online literature, reading CoreAudio.pdf,
and scanning the headers, I have still not determined how to set the
size of the AudioUnit buffer, or where the default value of 512 packets
comes from.
As it is, my file-reading code grabs 512 packets at a time, via the
ioNumberDataPackets value that AudioConverterFillComplexBuffer passes to
my input procedure. That number, in turn, comes from the render process
registered by calling AudioUnitSetProperty with the
kAudioUnitProperty_SetRenderCallback property.
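In case it helps to see it spelled out, this is roughly the registration I mean (a sketch, untested; MyRenderProc and refCon stand in for your own callback and state):

```c
#include <AudioUnit/AudioUnit.h>

/* Sketch: the render callback the unit pulls for audio data.
   inNumberFrames is bounded by the unit's MaximumFramesPerSlice. */
static OSStatus MyRenderProc(void *inRefCon,
                             AudioUnitRenderActionFlags *ioActionFlags,
                             const AudioTimeStamp *inTimeStamp,
                             UInt32 inBusNumber,
                             UInt32 inNumberFrames,
                             AudioBufferList *ioData)
{
    /* Fill ioData with inNumberFrames frames of audio here. */
    return noErr;
}

/* Register the callback on the unit's input scope, element 0. */
OSStatus InstallRenderCallback(AudioUnit unit, void *refCon)
{
    AURenderCallbackStruct cb = { MyRenderProc, refCon };
    return AudioUnitSetProperty(unit,
                                kAudioUnitProperty_SetRenderCallback,
                                kAudioUnitScope_Input,
                                0,
                                &cb,
                                sizeof(cb));
}
```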
The solution (I imagine) is to also call AudioUnitSetProperty with the
kAudioUnitProperty_SetExternalBuffer property, but I can't tell what
arguments I should pass with that call.
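If I'm reading the headers right, the call would look something like the following (a guess, untested; the AudioUnitExternalBuffer struct fields are as I recall them from AUComponent.h, and buf/byteSize are a caller-owned buffer of my own):

```c
#include <AudioUnit/AudioUnit.h>

/* Sketch: hand the unit a caller-owned buffer via
   kAudioUnitProperty_SetExternalBuffer. The caller keeps
   ownership of 'buf' for the life of the unit. */
OSStatus UseExternalBuffer(AudioUnit unit, Byte *buf, UInt32 byteSize)
{
    AudioUnitExternalBuffer ext;
    ext.buffer = buf;
    ext.size   = byteSize;
    return AudioUnitSetProperty(unit,
                                kAudioUnitProperty_SetExternalBuffer,
                                kAudioUnitScope_Global,
                                0,
                                &ext,
                                sizeof(ext));
}
```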
The net result is halting audio output, which sounds like the buffer
being starved while the OS is off doing other things. My thought was
that if I used a larger buffer, this negative effect would be
eliminated.
Thanks for your assistance,
Lance Drake
_______________________________________________
coreaudio-api mailing list | email@hidden
Help/Unsubscribe/Archives:
http://www.lists.apple.com/mailman/listinfo/coreaudio-api
Do not post admin requests to the list. They will be ignored.