Re: Output AU buffer size for RenderCallback


  • Subject: Re: Output AU buffer size for RenderCallback
  • From: William Stewart <email@hidden>
  • Date: Mon, 31 Oct 2005 11:14:35 -0800


On 30/10/2005, at 11:03 AM, Tim Dorcey wrote:

I found mention in the archives here that the value of "inNumberFrames" passed
to a RenderCallback for the default Output Unit is determined by the attached
device's "I/O proc frame count." Is this a property that clients can (or
should) change?

Yes

On my system, talking to "Built-in Audio," it seems to want about 12 msec
per render callback. This is fine, and anything in the range < 100 msec
would be fine for my app. Do I need to concern myself with getting
something outside that range, and how do I change it if so?

If you want to go bigger, then you can for some devices - you could, for instance, set the duty cycle to be around 24 msec (1024 samples at 44.1 kHz). You have to be prepared, though, that some devices may not want to be set to a different I/O size, and you just have to live with what they give you.
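(For reference, a minimal sketch of one way to request a larger I/O size, assuming the AUHAL-based default output unit: fetch the device behind the unit via kAudioOutputUnitProperty_CurrentDevice and set kAudioDevicePropertyBufferFrameSize on it. The helper name and the frame count are illustrative, and the HAL calls shown are the ones current in 2005, since replaced by AudioObjectSetPropertyData.)

#include <AudioUnit/AudioUnit.h>
#include <CoreAudio/CoreAudio.h>

/* Ask the output unit which device it is attached to, then request a
 * 1024-frame I/O buffer (~23 msec at 44.1 kHz). The device may refuse
 * or clamp the value, so read the property back to see what you got. */
static OSStatus SetIOBufferFrames(AudioUnit outputUnit, UInt32 requestedFrames)
{
    AudioDeviceID device = kAudioDeviceUnknown;
    UInt32 size = sizeof(device);
    OSStatus err = AudioUnitGetProperty(outputUnit,
                                        kAudioOutputUnitProperty_CurrentDevice,
                                        kAudioUnitScope_Global, 0,
                                        &device, &size);
    if (err != noErr) return err;

    err = AudioDeviceSetProperty(device, NULL, 0, false,
                                 kAudioDevicePropertyBufferFrameSize,
                                 sizeof(requestedFrames), &requestedFrames);
    if (err != noErr) return err;

    UInt32 actualFrames = 0;
    size = sizeof(actualFrames);
    AudioDeviceGetProperty(device, 0, false,
                           kAudioDevicePropertyBufferFrameSize,
                           &size, &actualFrames);
    /* actualFrames is what the device will hand your render callback. */
    return noErr;
}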



Also, I based my code on a "Quick and Dirty" example that was posted here,
which was nothing other than:
FindNextComponent() kAudioUnitSubType_DefaultOutput
OpenAComponent()
AudioUnitSetProperty() kAudioUnitProperty_StreamFormat
AudioUnitSetProperty() kAudioUnitProperty_SetRenderCallback
AudioUnitInitialize()
AudioOutputUnitStart()
...
AudioOutputUnitStop()
AudioUnitUninitialize()
CloseComponent()


Then, my render callback simply fills the AudioBufferList presented to it.
Is that all there is to it?

Yes. That's it. Simple enough, I think!
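(For concreteness, here is a minimal sketch of that sequence, assuming 32-bit float stereo at 44.1 kHz and the Component Manager calls used above - FindNextComponent/OpenAComponent, which later systems replace with AudioComponentFindNext/AudioComponentInstanceNew. Error checking is omitted and the callback just writes silence.)

#include <AudioUnit/AudioUnit.h>
#include <CoreServices/CoreServices.h>
#include <string.h>
#include <unistd.h>

static OSStatus MyRenderCallback(void *inRefCon,
                                 AudioUnitRenderActionFlags *ioActionFlags,
                                 const AudioTimeStamp *inTimeStamp,
                                 UInt32 inBusNumber,
                                 UInt32 inNumberFrames,
                                 AudioBufferList *ioData)
{
    /* Fill every buffer in the list; silence here, real samples in practice. */
    for (UInt32 i = 0; i < ioData->mNumberBuffers; i++)
        memset(ioData->mBuffers[i].mData, 0, ioData->mBuffers[i].mDataByteSize);
    return noErr;
}

int main(void)
{
    /* Locate and open the default output unit. */
    ComponentDescription desc = { kAudioUnitType_Output,
                                  kAudioUnitSubType_DefaultOutput,
                                  kAudioUnitManufacturer_Apple, 0, 0 };
    Component comp = FindNextComponent(NULL, &desc);
    AudioUnit outputUnit;
    OpenAComponent(comp, &outputUnit);

    /* Format we will supply: 44.1 kHz stereo, 32-bit float, non-interleaved. */
    AudioStreamBasicDescription fmt;
    memset(&fmt, 0, sizeof(fmt));
    fmt.mSampleRate       = 44100.0;
    fmt.mFormatID         = kAudioFormatLinearPCM;
    fmt.mFormatFlags      = kAudioFormatFlagsNativeFloatPacked |
                            kAudioFormatFlagIsNonInterleaved;
    fmt.mFramesPerPacket  = 1;
    fmt.mChannelsPerFrame = 2;
    fmt.mBitsPerChannel   = 32;
    fmt.mBytesPerPacket   = sizeof(Float32);
    fmt.mBytesPerFrame    = sizeof(Float32);
    AudioUnitSetProperty(outputUnit, kAudioUnitProperty_StreamFormat,
                         kAudioUnitScope_Input, 0, &fmt, sizeof(fmt));

    /* Install the render callback on input bus 0 of the output unit. */
    AURenderCallbackStruct cb = { MyRenderCallback, NULL };
    AudioUnitSetProperty(outputUnit, kAudioUnitProperty_SetRenderCallback,
                         kAudioUnitScope_Input, 0, &cb, sizeof(cb));

    AudioUnitInitialize(outputUnit);
    AudioOutputUnitStart(outputUnit);

    sleep(5);                       /* callback runs on the I/O thread */

    AudioOutputUnitStop(outputUnit);
    AudioUnitUninitialize(outputUnit);
    CloseComponent(outputUnit);
    return 0;
}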

Also, is it correct to presume that if I Open/Start multiple instances of
the Output AU, the audio streams will be mixed at final output? And, each
could have a different stream format (e.g., sampling rate)?

Yes - but this is the least desirable place to have your mixing done. It's better to expand your logic slightly to include a mixer AU feeding data to the output unit (make a connection between the two); then your "multiple output units" just become multiple inputs to the mixer. The AUGraph code can be used to help with connecting AUs together... The mixer AUs have volume controls on their inputs, etc., so this all comes in handy.
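(Loosely following that suggestion, here is a sketch of an AUGraph with a mixer in front of the default output. The AUGraph calls are given their later names - AUGraphAddNode etc., which the 10.4-era headers spell slightly differently - and the callback names, bus count, and volume values are illustrative.)

#include <AudioToolbox/AudioToolbox.h>

/* Two source callbacks (hypothetical, with the usual AURenderCallback
 * signature) feeding a mixer, mixer feeding the default output. */
extern OSStatus MyRenderCallbackA(void *, AudioUnitRenderActionFlags *,
                                  const AudioTimeStamp *, UInt32, UInt32,
                                  AudioBufferList *);
extern OSStatus MyRenderCallbackB(void *, AudioUnitRenderActionFlags *,
                                  const AudioTimeStamp *, UInt32, UInt32,
                                  AudioBufferList *);

static AUGraph BuildMixerGraph(void)
{
    AUGraph graph;
    NewAUGraph(&graph);

    AudioComponentDescription mixerDesc = { kAudioUnitType_Mixer,
                                            kAudioUnitSubType_MultiChannelMixer,
                                            kAudioUnitManufacturer_Apple, 0, 0 };
    AudioComponentDescription outDesc   = { kAudioUnitType_Output,
                                            kAudioUnitSubType_DefaultOutput,
                                            kAudioUnitManufacturer_Apple, 0, 0 };
    AUNode mixerNode, outputNode;
    AUGraphAddNode(graph, &mixerDesc, &mixerNode);
    AUGraphAddNode(graph, &outDesc, &outputNode);
    AUGraphOpen(graph);

    /* Mixer output 0 -> output unit input 0. */
    AUGraphConnectNodeInput(graph, mixerNode, 0, outputNode, 0);

    /* Give the mixer two input buses and hang a render callback on each. */
    AudioUnit mixerUnit;
    AUGraphNodeInfo(graph, mixerNode, NULL, &mixerUnit);
    UInt32 busCount = 2;
    AudioUnitSetProperty(mixerUnit, kAudioUnitProperty_ElementCount,
                         kAudioUnitScope_Input, 0, &busCount, sizeof(busCount));

    AURenderCallbackStruct cbA = { MyRenderCallbackA, NULL };
    AURenderCallbackStruct cbB = { MyRenderCallbackB, NULL };
    AUGraphSetNodeInputCallback(graph, mixerNode, 0, &cbA);
    AUGraphSetNodeInputCallback(graph, mixerNode, 1, &cbB);

    /* Per-input volume, as mentioned above. */
    AudioUnitSetParameter(mixerUnit, kMultiChannelMixerParam_Volume,
                          kAudioUnitScope_Input, 0, 1.0f, 0);
    AudioUnitSetParameter(mixerUnit, kMultiChannelMixerParam_Volume,
                          kAudioUnitScope_Input, 1, 0.5f, 0);

    AUGraphInitialize(graph);
    AUGraphStart(graph);
    return graph;
}

If the sources differ in stream format (e.g. sample rate), you would likely set the format on each mixer input bus, or put an AUConverter in front of any input the mixer will not convert itself.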


Bill


Thanks, Tim


--
mailto:email@hidden
tel: +1 408 974 4056
__________________________________________________________________________
"Much human ingenuity has gone into finding the ultimate Before.
The current state of knowledge can be summarized thus:
In the beginning, there was nothing, which exploded" - Terry Pratchett
__________________________________________________________________________


_______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list      (email@hidden)
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden


References:
  • Output AU buffer size for RenderCallback (From: "Tim Dorcey" <email@hidden>)
