
Stereo recording with remoteIO - how?


  • Subject: Stereo recording with remoteIO - how?
  • From: Alex Gross <email@hidden>
  • Date: Thu, 15 Dec 2011 21:16:17 +0100

I have been developing a low-latency iOS audio app with remoteIO for three years now, and I thought I knew my way around AudioUnits. Mono recording works fine, but after two days of trying I still can't get stereo recording to work. I tested with an iPad in the Alesis iO Dock and with a GuitarJack on an old iPod touch. GarageBand for iOS offers channel selection and records stereo just fine on the same hardware; all I get is the left channel.

Here's my question:

In a recording callback

static OSStatus recordingCallback(void *inRefCon,
                                  AudioUnitRenderActionFlags *ioActionFlags,
                                  const AudioTimeStamp *inTimeStamp,
                                  UInt32 inBusNumber,
                                  UInt32 inNumberFrames,
                                  AudioBufferList *ioData)
{
    ...
    // One buffer holding both channels, interleaved
    AudioBufferList *inputBufferList = calloc(1, offsetof(AudioBufferList, mBuffers) + sizeof(AudioBuffer));
    inputBufferList->mNumberBuffers = 1;
    inputBufferList->mBuffers[0].mNumberChannels = 2;
    inputBufferList->mBuffers[0].mDataByteSize = 2 * sizeof(SInt16) * inNumberFrames;
    inputBufferList->mBuffers[0].mData = malloc(inputBufferList->mBuffers[0].mDataByteSize);
    AudioUnitRender(audioUnit, ioActionFlags, inTimeStamp, 1 /* input bus */, inNumberFrames, inputBufferList);
    ...
}

where, during initialization, the format was set as follows:

AudioStreamBasicDescription inputFormat = { 0 };
inputFormat.mSampleRate = 44100;
inputFormat.mFormatID = kAudioFormatLinearPCM;
inputFormat.mFormatFlags = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
inputFormat.mFramesPerPacket = 1;
inputFormat.mChannelsPerFrame = 2; // of course I detected mono/stereo before and set this to 1 if only mono is available
inputFormat.mBitsPerChannel = 16;
inputFormat.mBytesPerFrame = inputFormat.mChannelsPerFrame * sizeof(SInt16);
inputFormat.mBytesPerPacket = inputFormat.mBytesPerFrame * inputFormat.mFramesPerPacket;
UInt32 inputFormatSize = sizeof(inputFormat);
AudioUnitSetProperty(audioUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Output, 1, &inputFormat, inputFormatSize); // element 1 (input bus), output scope: the side that delivers captured audio to the app

How do I retrieve both channels from inputBufferList?


FAIL 1:
Since I didn't specify kAudioFormatFlagIsNonInterleaved, I expected to get one buffer with both channels interleaved. Should the buffer size then be (2 * inNumberFrames * sizeof(SInt16)) or (inNumberFrames * sizeof(SInt16))?

De-interleaving the data doesn't work:

SInt16 *stereoBuffer = malloc(inputBufferList->mBuffers[0].mDataByteSize);
SInt16 *stereoBufferPtr = stereoBuffer;
SInt16 *leftBufferPtr  = inputBufferList->mBuffers[0].mData;
SInt16 *rightBufferPtr = inputBufferList->mBuffers[0].mData;
rightBufferPtr += inNumberFrames; // assumes the right channel starts halfway into the buffer
for (UInt32 i = 0; i < inNumberFrames; ++i)
{
    *stereoBufferPtr++ = *leftBufferPtr++;
    *stereoBufferPtr++ = *rightBufferPtr++;
}
// ... copying stereoBuffer into my output buffer ...
free(stereoBuffer);


FAIL 2:
I also tried setting kAudioFormatFlagIsNonInterleaved and allocating two mono buffers (note that the list itself must then be allocated with room for two AudioBuffer entries):

inputBufferList = calloc(1, offsetof(AudioBufferList, mBuffers) + 2 * sizeof(AudioBuffer));
inputBufferList->mNumberBuffers = 2;
for (UInt32 i = 0; i < inputBufferList->mNumberBuffers; ++i)
{
    inputBufferList->mBuffers[i].mNumberChannels = 1;
    inputBufferList->mBuffers[i].mDataByteSize = sizeof(SInt16) * inNumberFrames;
    inputBufferList->mBuffers[i].mData = malloc(inputBufferList->mBuffers[i].mDataByteSize);
}


Am I right in assuming that I first need to set kAudioUnitProperty_StreamFormat, then allocate a matching AudioBufferList, and that the two together determine how AudioUnitRender stores the data into my buffers?

What am I overlooking or doing wrong? Is there a good example or documentation anywhere on the internet? The aurioTouch example didn't help because it does input and output in the same callback. Thanks in advance for any advice!
 _______________________________________________
Coreaudio-api mailing list      (email@hidden)

  • Follow-Ups:
    • Re: Stereo recording with remoteIO - how?
      • From: tom zicarelli <email@hidden>