Re: Q: Setting the size of an AudioUnit buffer?


  • Subject: Re: Q: Setting the size of an AudioUnit buffer?
  • From: "A. Le Flanchec" <email@hidden>
  • Date: Mon, 09 Feb 2004 12:52:22 +0000

Thank you, Bill!

I tried to change the device's I/O proc's frame count... but I still have those dropouts in the audio file I am playing. I guess I should take a look at the "PlayBufferedSound" example that I found on the CoreAudio Swiki page.

Vince

From: William Stewart <email@hidden>
To: "A. Le Flanchec" <email@hidden>
CC: email@hidden
Subject: Re: Q: Setting the size of an AudioUnit buffer?
Date: Fri, 6 Feb 2004 10:02:44 -0800


On 06/02/2004, at 12:34 AM, A. Le Flanchec wrote:

Hello,

I also have a problem with that. I tried to set the MaximumFramesPerSlice property (up to 65536), but it did not change ioNumberDataPackets (in the example PlayAudioFileLite, as far as I remember), which is still set at 512. Maybe I'm doing something wrong... I used kAudioUnitScope_Global... is that right?

Max Frames just tells the AU the maximum (and, we would hope, normal and only!) number of frames that you are going to ask it to render at any one time... The AU then knows how much memory to allocate for buffers (like delay lines, etc.).

It has *nothing* to do with the actual number of frames of the render call.
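For illustration, a minimal sketch of setting that property, assuming an already-opened AudioUnit (the helper name here is made up, not from any SDK sample):

#include <AudioUnit/AudioUnit.h>

// Sketch: tell the AU the largest slice it will ever be asked to render.
// This only sizes the AU's internal buffers; it does not change the
// frame count of the actual render calls.
static OSStatus SetMaxFramesPerSlice(AudioUnit unit, UInt32 maxFrames)
{
    return AudioUnitSetProperty(unit,
                                kAudioUnitProperty_MaximumFramesPerSlice,
                                kAudioUnitScope_Global,   // global scope, element 0
                                0,
                                &maxFrames,
                                sizeof(maxFrames));
}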

If you are talking to an Output AU that is talking to a device (like the AUHAL or Default Output units), then the actual number of frames that the AU is going to render is determined by the device's I/O proc's frame count (see <CoreAudio/AudioHardware.h>).
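A hedged sketch of what changing that frame count can look like with the HAL calls in <CoreAudio/AudioHardware.h> (the helper name is invented; the device ID would typically be the default output device, and the value gets clamped to the device's kAudioDevicePropertyBufferFrameSizeRange):

#include <CoreAudio/AudioHardware.h>

// Sketch: ask the HAL to use a larger I/O buffer for this device. This is
// what actually determines how many frames each I/O proc / render cycle handles.
static OSStatus SetDeviceBufferFrameSize(AudioDeviceID device, UInt32 frames)
{
    return AudioDeviceSetProperty(device,
                                  NULL,    // apply as soon as possible
                                  0,       // master channel
                                  false,   // output side
                                  kAudioDevicePropertyBufferFrameSize,
                                  sizeof(frames),
                                  &frames);
}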

By the way, I am completely new to making audio apps. I'm stuck in the middle of a VERY steep learning curve, if you see what I mean... I think that I should use different threads for the audio rendering and the GUI. Maybe the fact that we hear dropouts while playing a file with 'PlayAudioFileLite' is also related to the fact that there are no separate threads? What do you think? Is it necessary to use separate threads with different priorities to make a rock-solid audio app? Different threads for the MIDI and the Audio side?

The much vaunted, and soon to come, next rev of the SDK has some completely rewritten audio file playing and recording code that I think will help a lot.

Bill


OK, these are very basic questions... but I really would appreciate some help, before I completely despair!

Vince


Subject: Re: Q: Setting the size of an AudioUnit buffer?
To: Lance Drake <email@hidden>
From: Marc Poirier <email@hidden>
Date: Sun, 1 Feb 2004 14:24:44 -0600 (CST)

You want to set the MaximumFramesPerSlice property (or maybe it's called MaxFramesPerSlice, but whatever, that's the one).

Marc



On Fri, 30 Jan 2004, Lance Drake wrote:

Hi Core-Audio folks,

After poring through the online literature, reading the CoreAudio.pdf and scanning through the headers, I have still not determined how to set the size of the AudioUnit buffer - or where the default value of 512 packets comes from.

As it is, my chore of reading a file to audio grabs 512 packets at a clip via the 'ioNumberDataPackets' quantity sent to my inputProcessor routine, which is set up by the call to AudioConverterFillComplexBuffer; that call in turn gets this number from the render proc specified by the call to AudioUnitSetProperty with the kAudioUnitProperty_SetRenderCallback property.
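For context, a rough sketch of the shape of that call chain (the names MyRenderProc and MyInputProc are made up, not from the sample code); the converter turns the render call's frame count into the packet requests that arrive in ioNumberDataPackets:

#include <AudioToolbox/AudioConverter.h>
#include <AudioUnit/AudioUnit.h>

// Input proc: the converter asks for *ioNumberDataPackets packets per pull.
static OSStatus MyInputProc(AudioConverterRef              converter,
                            UInt32                        *ioNumberDataPackets,
                            AudioBufferList               *ioData,
                            AudioStreamPacketDescription **outPacketDescriptions,
                            void                          *inUserData)
{
    // Read up to *ioNumberDataPackets packets from the file into ioData here,
    // and set *ioNumberDataPackets to the number actually delivered.
    return noErr;
}

// Render callback installed with kAudioUnitProperty_SetRenderCallback.
// inNumberFrames is driven by the device's I/O cycle, not by MaximumFramesPerSlice.
static OSStatus MyRenderProc(void                        *inRefCon,
                             AudioUnitRenderActionFlags  *ioActionFlags,
                             const AudioTimeStamp        *inTimeStamp,
                             UInt32                       inBusNumber,
                             UInt32                       inNumberFrames,
                             AudioBufferList             *ioData)
{
    AudioConverterRef converter = (AudioConverterRef)inRefCon;
    UInt32 packets = inNumberFrames;   // for LPCM output, packets == frames
    return AudioConverterFillComplexBuffer(converter, MyInputProc, NULL,
                                           &packets, ioData, NULL);
}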

The solution (I imagine) is to also make a call to AudioUnitSetProperty with the kAudioUnitProperty_SetExternalBuffer property - but I don't know what args I should send in with that call.

The net result is what sounds like halting audio output, which seems to be the buffer being starved because the OS is off doing other things. My thought was that if I use a larger buffer, this negative effect would be eliminated.

Thanks for your assistance,

Lance Drake


-- mailto:email@hidden
tel: +1 408 974 4056

________________________________________________________________________
"Much human ingenuity has gone into finding the ultimate Before.
The current state of knowledge can be summarized thus:
In the beginning, there was nothing, which exploded" - Terry Pratchett
________________________________________________________________________

