

How to process a buffer of 64-bit audio using an unconnected Audio Unit


  • Subject: How to process a buffer of 64-bit audio using an unconnected Audio Unit
  • From: Michael Norris <email@hidden>
  • Date: Thu, 08 Aug 2013 16:21:05 +1200

I'm writing a plug-in for Max/MSP which processes buffers of 64-bit audio using an Audio Unit. I have opened and initialized the Audio Unit fine, but have not attached it to any graph, because Max/MSP has to send me buffers of 64-bit data, which I have to process manually.

To do the processing, I call AudioUnitRender, but I'm a bit unclear about the best way to tell the Audio Unit which buffers to process: I don't quite understand how you pass an input buffer to an Audio Unit that isn't connected to anything. Currently, I set up an AURenderCallback on kAudioUnitScope_Input, and inside the render callback I point the AudioBuffers at the 64-bit buffers that Max/MSP gives me. But do I need to do this, or is there a simpler way?

FYI, it's producing audio, but it's distorted, as if I've got the endianness, or the float size, or something wrong.

Here's how it looks (I've stripped out a few variable definitions just to keep it tidy):

// set up Audio Unit
err = AudioComponentInstanceNew(comp, &(x->audioUnit));
if (err) {
	// bail
} else {
	result = AudioUnitInitialize(x->audioUnit);
	if (result) {
		// bail
	} else {
		input.inputProc = (AURenderCallback) ao_coreaudio_render_proc;
		input.inputProcRefCon = x;
		result = AudioUnitSetProperty(x->audioUnit,
		                              kAudioUnitProperty_SetRenderCallback,
		                              kAudioUnitScope_Input,
		                              0, &input, sizeof(input));
		if (result) {
			// bail
		} else {
			x->callbackSet = TRUE;
			x->x_myTimeStamp.mSampleTime = 0;
			x->x_myTimeStamp.mFlags = kAudioTimeStampSampleTimeValid;

			// variable-length AudioBufferList: one AudioBuffer per channel
			x->x_theAudioData = (AudioBufferList *) malloc(offsetof(AudioBufferList, mBuffers[kNumChannels]));
			x->x_theAudioData->mNumberBuffers = kNumChannels;
			for (i = 0; i < kNumChannels; i++) {
				x->x_theAudioData->mBuffers[i].mNumberChannels = 1;
			}

			// set audio stream format
			AudioStreamBasicDescription asbd;
			asbd.mSampleRate = sys_getsr(); // asks Max for the current sample rate
			asbd.mFormatID = kAudioFormatLinearPCM;
			asbd.mFormatFlags = kAudioFormatFlagsAudioUnitCanonical;
			asbd.mBytesPerPacket = sizeof(double);
			asbd.mFramesPerPacket = 1;
			asbd.mBytesPerFrame = sizeof(double);
			asbd.mChannelsPerFrame = 1;
			asbd.mBitsPerChannel = 8 * sizeof(double);
			asbd.mReserved = 0;

			result = AudioUnitSetProperty(x->audioUnit,
			                              kAudioUnitProperty_StreamFormat,
			                              kAudioUnitScope_Global,
			                              0, &asbd, sizeof(asbd));
		}
	}
}

// Max/MSP calls this routine for every audio buffer
void au_perform64(t_au *x, t_object *dsp64, double **ins, long numins, double **outs, long numouts, long sampleframes, long flags, void *userparam)
{
	if (numins > 0)  inL = ins[0];   else inL = nil;
	if (numins > 1)  inR = ins[1];   else inR = nil;
	if (numouts > 0) outL = outs[0]; else outL = nil;
	if (numouts > 1) outR = outs[1]; else outR = nil;

	for (i = 0; i < kNumChannels; i++) {
		x->x_theAudioData->mBuffers[i].mData = NULL; // not sure if I need to do this
	}

	result = AudioUnitRender(x->audioUnit,
	                         &actionFlags,
	                         &x->x_myTimeStamp,
	                         0,
	                         sampleframes,
	                         x->x_theAudioData);

	// copy out to Max's own buffers, one AudioBuffer per channel
	memcpy(outL, x->x_theAudioData->mBuffers[0].mData, chunkSizeInBytes);
	memcpy(outR, x->x_theAudioData->mBuffers[1].mData, chunkSizeInBytes);

	x->x_myTimeStamp.mSampleTime += sampleframes;
}

// This is my callback proc, called by AudioUnitRender
static OSStatus ao_coreaudio_render_proc(void *inRefCon,
                                         AudioUnitRenderActionFlags *ioActionFlags,
                                         const AudioTimeStamp *inTimeStamp,
                                         UInt32 inBusNumber,
                                         UInt32 inNumberFrames,
                                         AudioBufferList *ioData)
{
	t_au *x = (t_au *)inRefCon;
	long numBytes = inNumberFrames * sizeof(double);

	// supply the AU with our input buffers
	ioData->mBuffers[0].mData = x->inL;
	ioData->mBuffers[0].mNumberChannels = 1;
	ioData->mBuffers[0].mDataByteSize = numBytes;
	if (x->inR != nil && x->inR != x->inL) {
		ioData->mBuffers[1].mData = x->inR;
		ioData->mBuffers[1].mNumberChannels = 1;
		ioData->mBuffers[1].mDataByteSize = numBytes;
	}
	post("Render: x->inL %p x->inR %p n bytes: %ld", x->inL, x->inR, numBytes);
	return noErr;
}


————————————————
MICHAEL NORRIS
Senior Lecturer, Composition & Sonic Art
Editor, Waiteata Music Press
New Zealand School of Music
PO Box 2332
Wellington
NEW ZEALAND

ph		+64 4 463 7456
mob 	+64 21 211 0138
web		www.michaelnorris.info


 _______________________________________________
Coreaudio-api mailing list      (email@hidden)

