
Re: Stereo recording with remoteIO - how?


  • Subject: Re: Stereo recording with remoteIO - how?
  • From: "email@hidden" <email@hidden>
  • Date: Sun, 18 Dec 2011 12:29:14 +0100

Hi Tom,

Thanks a million for pointing me in the right direction. The reason my code only worked for mono input was that I specified sizeof(SInt16) instead of sizeof(AudioUnitSampleType), which is 4 bytes. From AudioGraph's documentation:

"You can set the remote IO input bus (mic) to SInt16, as long as you are only using one channel. SInt16 format streams will not work with multiple channels, (stereo)."

Why is this not on page 1 of Apple's remoteIO documentation? Anyway, AudioUnitRender now delivers 8.24 samples (mono or stereo alike), which I then convert to SInt16. If anyone is interested in the code, ask away and I'll post some snippets.
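
For the impatient, here's a minimal sketch of the conversion step (the function name and arguments are placeholders, not my production code). Each 8.24 sample is an SInt32 with 24 fractional bits, so dropping 9 of them (24 - 15 = 9) leaves a signed 16-bit sample:

static void convert824ToSInt16(const AudioUnitSampleType *in,
                               SInt16 *out,
                               UInt32 frameCount)
{
    for (UInt32 i = 0; i < frameCount; ++i) {
        // Shift out 9 fractional bits: 8.24 fixed point -> SInt16.
        out[i] = (SInt16)(in[i] >> 9);
    }
}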

Alex



2011/12/15 tom zicarelli <email@hidden>
Hi Alex,
 
Set the ASBD with kAudioFormatFlagsAudioUnitCanonical and set the channel count to 2,
 
like this:
 
    size_t bytesPerSample = sizeof (AudioUnitSampleType);
    // Fill the application audio format struct's fields to define a linear PCM,
    //        stereo, noninterleaved stream at the hardware sample rate.
    stereoStreamFormat.mFormatID          = kAudioFormatLinearPCM;
    stereoStreamFormat.mFormatFlags       = kAudioFormatFlagsAudioUnitCanonical;
    stereoStreamFormat.mBytesPerPacket    = bytesPerSample;
    stereoStreamFormat.mFramesPerPacket   = 1;
    stereoStreamFormat.mBytesPerFrame     = bytesPerSample;
    stereoStreamFormat.mChannelsPerFrame  = 2;                    // 2 indicates stereo
    stereoStreamFormat.mBitsPerChannel    = 8 * bytesPerSample;
    stereoStreamFormat.mSampleRate        = graphSampleRate;
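
Then set that format on the output scope of the input bus (bus 1); a sketch, assuming your remote IO unit is called rioUnit:

    AudioUnitSetProperty(rioUnit,
                         kAudioUnitProperty_StreamFormat,
                         kAudioUnitScope_Output,
                         1,                           // input (mic) bus
                         &stereoStreamFormat,
                         sizeof(stereoStreamFormat));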
 
 
 
In the callback, pull samples from the input:
 
AudioUnitRender(rioUnit, ioActionFlags, inTimeStamp, bus1, inNumberFrames, ioData);
 
You can access the left and right channel data like this:
 
AudioUnitSampleType *left;
AudioUnitSampleType *right;
 
left = ioData->mBuffers[0].mData;
right = ioData->mBuffers[1].mData;
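
From there you can walk the frames; a sketch:

    for (UInt32 i = 0; i < inNumberFrames; ++i) {
        AudioUnitSampleType l = left[i];   // 8.24 fixed point, left channel
        AudioUnitSampleType r = right[i];  // 8.24 fixed point, right channel
        // ... process the frame ...
    }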
 
Check out Apple's MixerHost sample code, or: http://zerokidz.com/audiograph
 
Tom
 
 
From: Alex Gross
Sent: Thursday, December 15, 2011 3:16 PM
To: email@hidden
Subject: Stereo recording with remoteIO - how?
 
I've been developing a low-latency iOS audio app with remoteIO for three years now, and I thought I knew my way around AudioUnits. Mono recording works fine, but I've spent two days trying to get stereo recording working, with no luck. I tested with an iPad in an Alesis iO Dock and with a GuitarJack on an old iPod touch. GarageBand for iOS offers channel selection and records stereo just fine; all I'm getting is the left channel.
 
Here's my question:
 
In a recording callback
 
static OSStatus recordingCallback(void *inRefCon,
                                  AudioUnitRenderActionFlags *ioActionFlags,
                                  const AudioTimeStamp *inTimeStamp,
                                  UInt32 inBusNumber,
                                  UInt32 inNumberFrames,
                                  AudioBufferList *ioData)
{
    ...
    AudioBufferList *inputBufferList = calloc(1, offsetof(AudioBufferList, mBuffers) + sizeof(AudioBuffer));
    inputBufferList->mNumberBuffers = 1;
    inputBufferList->mBuffers[0].mNumberChannels = 2;
    inputBufferList->mBuffers[0].mDataByteSize = 2 * sizeof(SInt16) * inNumberFrames;
    inputBufferList->mBuffers[0].mData = malloc(inputBufferList->mBuffers[0].mDataByteSize);
    AudioUnitRender(audioUnit, ioActionFlags, inTimeStamp, 1, inNumberFrames, inputBufferList); // 1 stands for the input bus
    ...
}
 
where during initialization, the format was set to
 
AudioStreamBasicDescription inputFormat = { 0 };
inputFormat.mSampleRate = 44100;
inputFormat.mFormatID = kAudioFormatLinearPCM;
inputFormat.mFormatFlags = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
inputFormat.mFramesPerPacket = 1;
inputFormat.mChannelsPerFrame = 2; // of course I detected mono/stereo before and set this to 1 if only mono is available
inputFormat.mBitsPerChannel = 16;
inputFormat.mBytesPerFrame = inputFormat.mChannelsPerFrame * sizeof(SInt16);
inputFormat.mBytesPerPacket = inputFormat.mBytesPerFrame * inputFormat.mFramesPerPacket;
UInt32 inputFormatSize = sizeof(inputFormat);
AudioUnitSetProperty(audioUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Output, 1, &inputFormat, inputFormatSize); // 1 stands for the input bus
 
How do I retrieve both channels from inputBufferList?
 
 
FAIL 1:
Since I didn't specify kAudioFormatFlagIsNonInterleaved, I expected to get 1 buffer with both channels interleaved. Is the size of the buffer (2 * inNumberFrames * sizeof(SInt16)) or (inNumberFrames * sizeof(SInt16)) ?
 
De-interleaving the data doesn't work:
 
SInt16 *stereoBuffer = malloc(inputBufferList->mBuffers[0].mDataByteSize);
SInt16 *stereoBufferPtr = stereoBuffer;
SInt16 *leftBufferPtr = inputBufferList->mBuffers[0].mData;
SInt16 *rightBufferPtr = inputBufferList->mBuffers[0].mData;
rightBufferPtr += inNumberFrames;
for (UInt32 i = 0; i < inNumberFrames; ++i)
{
    *stereoBufferPtr++ = *leftBufferPtr++;
    *stereoBufferPtr++ = *rightBufferPtr++;
}
// ... copying stereoBuffer into my output buffer ...
free(stereoBuffer);
 
 
FAIL 2:
I also tried setting kAudioFormatFlagIsNonInterleaved and therefore allocating
 
inputBufferList->mNumberBuffers = 2;
for (UInt32 i = 0; i < inputBufferList->mNumberBuffers; ++i)
{
    inputBufferList->mBuffers[0].mNumberChannels = 1;
    inputBufferList->mBuffers[0].mDataByteSize = sizeof(SInt16) * inNumberFrames;
    inputBufferList->mBuffers[0].mData = malloc(inputBufferList->mBuffers[0].mDataByteSize);
}
 
 
Is it right to assume that I first need to set kAudioUnitProperty_StreamFormat, then allocate an AudioBufferList compatible with that format, and that together they define how AudioUnitRender stores data into my AudioBufferList?
 
What am I overlooking or doing wrong? Is there a good example or documentation anywhere on the internet? The aurioTouch example didn't help because it does input and output in the same callback. Thanks in advance for any advice!


