Multi channel mixer - kAudioUnitSubType_MultiChannelMixer
  • Subject: Multi channel mixer - kAudioUnitSubType_MultiChannelMixer
  • From: Aran Mulholland <email@hidden>
  • Date: Sun, 24 May 2009 20:58:31 +1000


I've been attempting to use some code I got from a thread on this mailing list. I'm mixing stereo audio files using the kAudioUnitSubType_MultiChannelMixer, and my output is horribly distorted. I can still hear the basic structure of the wave file I am playing. At first I thought it was clipping, so I turned the output down (in my callback, by dividing each sample by 4), but no good, just quieter distortion :)
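
(The turning-down attempt was roughly this, reusing the names from the callback further down; the right channel was the same.)

    frameBuffer[j] = [[remoteIOplayer inMemoryAudioFile] getNextLeftPacket] / 4;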

I usually wouldn't post at such an early stage of research into a concept that is new to me, but I have been trawling the web and can't seem to find any resources for learning this stuff.


The initialisation code is:

    AudioComponentDescription mixerDescription, outputDescription;
    AUNode mixerNode;
    AUNode outputNode;
    AUGraph graph;
    OSStatus err = noErr;
    err = NewAUGraph(&graph);
    NSAssert(err == noErr, @"Error creating graph.");
    mixerDescription.componentFlags = 0;
    mixerDescription.componentFlagsMask = 0;
    mixerDescription.componentType = kAudioUnitType_Mixer;
    mixerDescription.componentSubType = kAudioUnitSubType_MultiChannelMixer;
    mixerDescription.componentManufacturer = kAudioUnitManufacturer_Apple;
    err = AUGraphAddNode(graph, &mixerDescription, &mixerNode);
    NSAssert(err == noErr, @"Error creating mixer node.");
    outputDescription.componentFlags = 0;
    outputDescription.componentFlagsMask = 0;
    outputDescription.componentType = kAudioUnitType_Output;
    outputDescription.componentSubType = kAudioUnitSubType_RemoteIO;
    outputDescription.componentManufacturer = kAudioUnitManufacturer_Apple;
    err = AUGraphAddNode(graph, &outputDescription, &outputNode);
    NSAssert(err == noErr, @"Error creating output node.");
    err = AUGraphOpen(graph);
    NSAssert(err == noErr, @"Error opening graph.");
    AURenderCallbackStruct callback;
    callback.inputProc = TestCallback;
    callback.inputProcRefCon = self;
   
    //mixer channel 0
    err = AUGraphSetNodeInputCallback(graph, mixerNode, 0, &callback);
    NSAssert(err == noErr, @"Error setting render callback.");
   
    err = AUGraphConnectNodeInput(graph, mixerNode, 0, outputNode, 0);
    NSAssert(err == noErr, @"Error connecting mixer to output.");
    err = AUGraphInitialize(graph);
    NSAssert(err == noErr, @"Error initializing graph.");
    err = AUGraphStart(graph);
    NSAssert(err == noErr, @"Error starting graph.");
    CAShow(graph);
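
One thing I notice is that I never set a stream format on the mixer's input bus, so presumably it falls back to whatever its default is. I was wondering whether I need to do something like the following (untested sketch, would go before AUGraphInitialize) so the input format matches the 16-bit samples my callback writes; I don't know whether the multichannel mixer will accept 16-bit integer input directly or whether it wants its own canonical format:

    //get the mixer audio unit out of its node so we can set properties on it
    AudioUnit mixerUnit;
    err = AUGraphNodeInfo(graph, mixerNode, NULL, &mixerUnit);
    NSAssert(err == noErr, @"Error getting mixer unit.");

    //describe 16 bit, non-interleaved stereo, which is what the callback fills
    AudioStreamBasicDescription streamFormat = {0};
    streamFormat.mSampleRate       = 44100.0;
    streamFormat.mFormatID         = kAudioFormatLinearPCM;
    streamFormat.mFormatFlags      = kAudioFormatFlagIsSignedInteger
                                   | kAudioFormatFlagIsPacked
                                   | kAudioFormatFlagIsNonInterleaved;
    streamFormat.mChannelsPerFrame = 2;
    streamFormat.mBitsPerChannel   = 16;
    streamFormat.mBytesPerFrame    = sizeof(SInt16);   //per channel, because non-interleaved
    streamFormat.mBytesPerPacket   = sizeof(SInt16);
    streamFormat.mFramesPerPacket  = 1;

    //apply it to mixer input bus 0, the bus the render callback feeds
    err = AudioUnitSetProperty(mixerUnit,
                               kAudioUnitProperty_StreamFormat,
                               kAudioUnitScope_Input,
                               0,
                               &streamFormat,
                               sizeof(streamFormat));
    NSAssert(err == noErr, @"Error setting mixer input format.");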

And the callback is as follows:

OSStatus TestCallback(void* inRefCon, AudioUnitRenderActionFlags* ioActionFlags, const AudioTimeStamp* inTimeStamp, UInt32 inBusNumber, UInt32 inNumberFrames, AudioBufferList* ioData)
{
     
    //get a copy of the objectiveC class "self" we need this to get the next sample to fill the buffer
    RemoteIOPlayer *remoteIOplayer = (RemoteIOPlayer *)inRefCon;
   
    //first zero out every output buffer so any unfilled frames are silence
    AudioBuffer *buf = ioData->mBuffers;
    for (UInt32 i = ioData->mNumberBuffers; i--; ++buf)
        memset((Byte *)buf->mData, 0, buf->mDataByteSize);
   
    //loop through all the buffers that need to be filled
    for (int i = 0 ; i < ioData->mNumberBuffers; i++){
        //get the buffer to be filled
        AudioBuffer buffer = ioData->mBuffers[i];

        SInt16 *frameBuffer = buffer.mData;
       
        //loop through the buffer and fill the frames
        for (int j = 0; j < inNumberFrames; j++){
            if(inBusNumber == 0){
                if(i == 0){
                    // getNextLeftPacket returns a UInt16 value, one frame.
                    frameBuffer[j] = [[remoteIOplayer inMemoryAudioFile] getNextLeftPacket];
                }
                else if (i == 1){
                    // getNextRightPacket returns a UInt16 value, one frame.
                    frameBuffer[j] = [[remoteIOplayer inMemoryAudioFile] getNextRightPacket];
                }
            }           
        }
    }
    return noErr;
}
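
To try to rule out my in-memory file reading as the culprit, I was also going to swap in a test callback that just writes a generated tone; something like this (untested sketch, assumes a 44.1 kHz sample rate and 16-bit output buffers, needs <math.h>):

OSStatus SineTestCallback(void* inRefCon, AudioUnitRenderActionFlags* ioActionFlags, const AudioTimeStamp* inTimeStamp, UInt32 inBusNumber, UInt32 inNumberFrames, AudioBufferList* ioData)
{
    //440 Hz tone at 44.1 kHz, kept well below full scale to avoid clipping
    static double phase = 0.0;
    const double phaseStep = 2.0 * M_PI * 440.0 / 44100.0;

    for (UInt32 i = 0; i < ioData->mNumberBuffers; i++){
        SInt16 *frameBuffer = (SInt16 *)ioData->mBuffers[i].mData;
        double p = phase;    //same starting phase for every channel
        for (UInt32 j = 0; j < inNumberFrames; j++){
            frameBuffer[j] = (SInt16)(sin(p) * 16384.0);
            p += phaseStep;
        }
    }
    phase += phaseStep * inNumberFrames;
    return noErr;
}

If the tone comes through clean then the graph setup is probably fine and the problem is in my sample reading; if the tone is distorted too then it is presumably a format mismatch somewhere in the graph.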