AudioBuffer fields in AudioUnitRender


  • Subject: AudioBuffer fields in AudioUnitRender
  • From: Craig Bakalian <email@hidden>
  • Date: Tue, 8 Mar 2005 04:33:20 -0500

Hi,
I am rendering audio data through an AUFormatConvert unit, which is connected to an arbitrary AUEffect unit, which is in turn connected to another AUFormatConvert unit, and then on down the chain with the changed data. I am putting converters at both ends so that whatever sample rate goes in also comes out. I know that I have to get the AudioStreamBasicDescription of the data coming into the first converter, get the AudioStreamBasicDescription of the effect unit, and set and connect them accordingly. I am pulling the data through in order to apply an effect to the mData.
I have coded things like this in the past, but not with this much flexibility in what kinds of streams go into the machinery described above; in the past I hard-coded the AudioBufferList.
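For the format matching itself, something along these lines is what I have in mind -- just a sketch, where converterIn, effect, and converterOut are placeholder names for the three units after they have been opened:

#include <AudioUnit/AudioUnit.h>

// Sketch: make both converters agree with whatever format the effect wants.
// Chain is converterIn -> effect -> converterOut (names are placeholders).
static OSStatus MatchFormats(AudioUnit converterIn, AudioUnit effect, AudioUnit converterOut)
{
    AudioStreamBasicDescription effectFormat;
    UInt32 size = sizeof(effectFormat);

    // Ask the effect what format it expects on input bus 0.
    OSStatus err = AudioUnitGetProperty(effect, kAudioUnitProperty_StreamFormat,
                                        kAudioUnitScope_Input, 0, &effectFormat, &size);
    if (err) return err;

    // Have the first converter produce that format on its output...
    err = AudioUnitSetProperty(converterIn, kAudioUnitProperty_StreamFormat,
                               kAudioUnitScope_Output, 0, &effectFormat, sizeof(effectFormat));
    if (err) return err;

    // ...and have the second converter accept it on its input.
    return AudioUnitSetProperty(converterOut, kAudioUnitProperty_StreamFormat,
                                kAudioUnitScope_Input, 0, &effectFormat, sizeof(effectFormat));
}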
For an AudioUnitRender call I must first create an AudioBufferList, set mBuffers[i].mNumberChannels, mBuffers[i].mDataByteSize, and mBuffers[i].mData for each buffer, and then pass that instance in the AudioUnitRender call.
The question here is: how do I set the fields of each AudioBuffer so that they agree with the AudioStreamBasicDescription of the AudioUnit that is getting the renderCallback?


for (int i = 0; i < numberOfChannels; i++)
{
    bufferList->mBuffers[i].mNumberChannels = ?
    bufferList->mBuffers[i].mDataByteSize = framesPerSlice * sizeof( float, int, unsigned ?)
    bufferList->mBuffers[i].mData = NULL; // I know this must be set to NULL
}
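
Here is my current guess at filling those fields from the stream format of the unit being pulled -- just a sketch, where unit and MakeBufferList are made-up names, and I am assuming the formats have already been negotiated as above:

#include <stdlib.h>
#include <AudioUnit/AudioUnit.h>

// Sketch: build an AudioBufferList sized from the unit's output stream format.
static AudioBufferList *MakeBufferList(AudioUnit unit, UInt32 framesPerSlice)
{
    AudioStreamBasicDescription asbd;
    UInt32 size = sizeof(asbd);
    if (AudioUnitGetProperty(unit, kAudioUnitProperty_StreamFormat,
                             kAudioUnitScope_Output, 0, &asbd, &size) != noErr)
        return NULL;

    // Non-interleaved: one AudioBuffer per channel, each carrying 1 channel.
    // Interleaved: one AudioBuffer carrying all mChannelsPerFrame channels.
    Boolean deinterleaved = (asbd.mFormatFlags & kAudioFormatFlagIsNonInterleaved) != 0;
    UInt32  bufferCount   = deinterleaved ? asbd.mChannelsPerFrame : 1;

    AudioBufferList *list = (AudioBufferList *)
        malloc(sizeof(AudioBufferList) + (bufferCount - 1) * sizeof(AudioBuffer));
    list->mNumberBuffers = bufferCount;

    for (UInt32 i = 0; i < bufferCount; i++) {
        list->mBuffers[i].mNumberChannels = deinterleaved ? 1 : asbd.mChannelsPerFrame;
        // mBytesPerFrame already reflects the sample type, so no sizeof() guessing.
        list->mBuffers[i].mDataByteSize   = framesPerSlice * asbd.mBytesPerFrame;
        // NULL asks AudioUnitRender to supply the buffer; point it at your own
        // allocation instead if you want the data written somewhere specific.
        list->mBuffers[i].mData = NULL;
    }
    return list;
}

The idea would then be to pass that list, along with the same framesPerSlice, straight to AudioUnitRender.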


I have seen framesPerSlice set to 1024 and to 512. Can this figure be arbitrary? I think I know it shouldn't be, but is there some documentation for letting this chunk of flesh between the chair and the keyboard know what is going on here? : > )) And as for the * sizeof(float, int, unsigned) -- wouldn't that need to be coordinated with the mFormatID of the AudioStreamBasicDescription of the AudioUnit whose data is being pulled upon?
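
My guess is that the per-call frame count isn't arbitrary but is bounded by the unit's maximum-frames-per-slice property -- again just a sketch, with unit as a placeholder and ConfigureSliceSize a made-up name:

#include <AudioUnit/AudioUnit.h>

// Sketch: check the largest slice the unit is prepared to render per call,
// and raise it if you intend to pull bigger slices yourself.
static OSStatus ConfigureSliceSize(AudioUnit unit, UInt32 desiredFrames)
{
    UInt32 maxFrames = 0;
    UInt32 propSize  = sizeof(maxFrames);
    OSStatus err = AudioUnitGetProperty(unit, kAudioUnitProperty_MaximumFramesPerSlice,
                                        kAudioUnitScope_Global, 0, &maxFrames, &propSize);
    if (err) return err;

    if (desiredFrames > maxFrames) {
        // My understanding is this should be set before the unit is initialized.
        err = AudioUnitSetProperty(unit, kAudioUnitProperty_MaximumFramesPerSlice,
                                   kAudioUnitScope_Global, 0,
                                   &desiredFrames, sizeof(desiredFrames));
    }
    return err;
}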
I know this is a lot to ask of this list. Is there any documentation for this?


Craig Bakalian

